Is it possible with the current version of Apache Beam to develop an unbounded source that receives data in a HTTP message? My intention is to run an HTTP Server and to inject the messages received into a Beam Pipeline. If it is possible, can it be done with the existing sources?
Apache Beam HTTP Unbounded Source Python
646 views · Asked by David lara · 1 answer below
It is possible. You can develop it by leveraging a Splittable DoFn; the older Source APIs look like they are going to be deprecated in the near future.
From my end, I am trying to develop a pipeline that consumes a REST API streaming JSON messages in the GET response body and supporting multiple connections, hence splitting the workload on the API side, as Adobe Livestream or the Twitter API do. This behaviour should enable scaling on the consumer end (Dataflow).
My struggle is that I can't figure out a splittable restriction for this use case. The stream is infinite and there is no offset as in messaging brokers like Kafka, nor a byte range as with files. My first idea was to build element/restriction pairs like (url, buffered reader), but I don't think buffered readers can be split.
One of the solutions might be to provide no restriction at all, but then I struggle to imagine how the pipeline would distribute elements and therefore scale.
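A restriction does not have to describe a position in the stream: since the API above supports multiple parallel connections, the restriction can describe *which* of the N allowed connections a worker is responsible for. A stdlib-only sketch of that idea follows; the class names are illustrative and loosely modelled on Beam's `OffsetRange`/`OffsetRestrictionTracker` semantics, not real Beam API.

```python
class ConnectionRange:
    """A restriction over parallel-connection slots [start, stop)."""
    def __init__(self, start, stop):
        self.start, self.stop = start, stop

    def size(self):
        return self.stop - self.start

    def split_at(self, pos):
        # Keep the head, hand the tail to another worker.
        return ConnectionRange(self.start, pos), ConnectionRange(pos, self.stop)


class ConnectionTracker:
    """try_claim/try_split semantics over connection slots."""
    def __init__(self, restriction):
        self.restriction = restriction
        self.claimed = restriction.start - 1

    def try_claim(self, slot):
        if self.restriction.start <= slot < self.restriction.stop:
            self.claimed = slot
            return True
        return False

    def try_split(self):
        # Give away every slot not yet claimed by this worker.
        split_pos = self.claimed + 1
        if split_pos >= self.restriction.stop:
            return None
        self.restriction, residual = self.restriction.split_at(split_pos)
        return residual


tracker = ConnectionTracker(ConnectionRange(0, 4))
assert tracker.try_claim(0)           # this worker opens connection 0
residual = tracker.try_split()        # slots 1-3 can go to other workers
assert (residual.start, residual.stop) == (1, 4)
assert tracker.restriction.size() == 1
```

Each claimed slot is still unbounded (the connection streams forever), which is exactly the situation Splittable DoFn's unbounded-per-element mode is designed for; the restriction only governs how the connections are distributed, not where the stream ends.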