Apache Beam HTTP Unbounded Source Python

645 Views · Asked by David lara

Is it possible with the current version of Apache Beam to develop an unbounded source that receives data in an HTTP message? My intention is to run an HTTP server and inject the received messages into a Beam pipeline. If it is possible, can it be done with the existing sources?

There is 1 answer below.
It is possible. You can develop it by leveraging a Splittable DoFn; the Source APIs look like they are going to be deprecated in the near future.
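For reference, here is a minimal sketch (not a tested implementation) of what such a Splittable DoFn could look like in the Python SDK, assuming the endpoint streams newline-delimited JSON over a long-lived GET. The URL handling, the record-counter restriction, and all names below are assumptions; the SDF calls follow the patterns in the Beam programming guide:

```python
import json

import apache_beam as beam
import requests
from apache_beam.io.restriction_trackers import OffsetRange
from apache_beam.io.restriction_trackers import OffsetRestrictionTracker
from apache_beam.utils import timestamp

# The stream has no natural offset, so the restriction is just a running
# record counter over an (effectively) unbounded offset range.
_NO_END = 2**63 - 1  # sentinel meaning "no end" for this sketch


class StreamRestrictionProvider(beam.transforms.core.RestrictionProvider):
  def initial_restriction(self, unused_url):
    return OffsetRange(0, _NO_END)

  def create_tracker(self, restriction):
    return OffsetRestrictionTracker(restriction)

  def restriction_size(self, unused_url, restriction):
    # Crude backlog estimate; a real source would report something meaningful.
    return restriction.size()


class ReadJsonStreamFn(beam.DoFn):
  """Holds one long-lived HTTP connection per input URL and emits JSON lines."""

  @beam.DoFn.unbounded_per_element()
  def process(
      self,
      url,
      tracker=beam.DoFn.RestrictionParam(StreamRestrictionProvider())):
    position = tracker.current_restriction().start
    try:
      with requests.get(url, stream=True, timeout=30) as response:
        for line in response.iter_lines():
          # Claim the next "record number" before emitting anything.
          if not tracker.try_claim(position):
            return  # the runner checkpointed or split us; stop cleanly
          position += 1
          if line:
            yield json.loads(line)
    except requests.RequestException:
      pass  # connection dropped; resume below
    # Stream ended or failed: ask the runner to resume this element later.
    # (A watermark estimator, e.g. via beam.DoFn.WatermarkEstimatorParam,
    # would also be needed for event-time processing; omitted here.)
    tracker.defer_remainder(timestamp.Duration(seconds=10))
```

Since a single connection cannot really be split, the restriction here is only used for checkpointing (try_claim / defer_remainder), not for parallelism.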
On my end, I am trying to develop such a pipeline: one that consumes a REST API streaming JSON messages in the body of a GET response and supporting multiple connections, so the workload can be split on the API side, as Adobe Livestream or Twitter do. This behaviour should enable scaling on the consumer end (Dataflow).
My struggle is that I can't figure out a splittable restriction for this use case. The stream is infinite and there is no offset, as in messaging brokers like Kafka, nor a byte range, as in files. I first wanted to build element/restriction pairs like (url, buffered reader), but I don't think buffered readers can be split.
One possible solution might be not to provide a restriction at all, but then I struggle to imagine how the pipeline would distribute elements and therefore scale.
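One way to picture the scaling, then, is to give up on splitting a single connection and instead treat each parallel connection the API offers as its own element, letting the runner spread those elements across workers. A sketch of the wiring, reusing the hypothetical ReadJsonStreamFn above (the shard URLs are made up):

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical shard URLs; the API is assumed to allow N parallel connections.
shard_urls = ['https://api.example.com/stream?shard=%d' % i for i in range(4)]

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as pipeline:
  messages = (
      pipeline
      | 'Shards' >> beam.Create(shard_urls)
      | 'Spread' >> beam.Reshuffle()  # avoid fusing all shards onto one worker
      | 'ReadStream' >> beam.ParDo(ReadJsonStreamFn()))
```

Parallelism then comes from the number of shard URLs rather than from splitting within one connection, which matches the multiple-connections behaviour described above.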