
News

Posted over 3 years ago by Simon Crosby
Brokers don't run applications; they act as a buffer between the real world and the application that analyzes events. An event stream processor (in Apache Kafka/Pulsar terms) or dataflow pipeline is an application that, given the event schema, analyzes the stream continuously to derive insights.
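
To make the distinction concrete, the sketch below (not part of the original post) shows the kind of application that sits downstream of the broker: a plain Java Kafka consumer that polls a topic and keeps a running count per event type. The topic name "device-events", the consumer group id, and the assumption that the record key carries the event type are illustrative choices, not anything stated above.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class EventTypeCounter {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "event-type-counter");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Running state derived from the stream; the broker itself only buffers the events.
        Map<String, Long> countsByType = new HashMap<>();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
          consumer.subscribe(Collections.singletonList("device-events")); // hypothetical topic
          while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
              // Assumes each record is keyed by its event type (illustrative schema choice).
              countsByType.merge(record.key(), 1L, Long::sum);
            }
          }
        }
      }
    }

The broker stores and delivers the events; all of the continuous analysis happens in this separate application.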
Posted almost 4 years ago by Simon Crosby
Swim enhances the actor model, in Java, to support continuous analysis of streaming data from millions of sources in a distributed runtime environment. Swim is the easiest way to build applications that continuously analyze streaming data from Apache Kafka.
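
The post above doesn't include code, but a rough sketch of the idea, assuming the open-source swimOS Java agent API (AbstractAgent, ValueLane, and the @SwimLane annotation), might look like the following. Treat the class name, lane name, and threshold as illustrative, not as Swim's documented usage.

    import swim.api.SwimLane;
    import swim.api.agent.AbstractAgent;
    import swim.api.lane.ValueLane;

    // One stateful, concurrent agent per data source (e.g. per device),
    // re-running its analysis whenever a new event for that source arrives.
    public class DeviceAgent extends AbstractAgent {

      @SwimLane("latestReading")
      ValueLane<Double> latestReading = this.<Double>valueLane()
          .didSet((newValue, oldValue) -> {
            // Analysis runs on every state change, not on a batch schedule.
            // The jump threshold of 10.0 is an arbitrary illustrative value.
            if (oldValue != null && Math.abs(newValue - oldValue) > 10.0) {
              System.out.println(nodeUri() + " reading jumped to " + newValue);
            }
          });
    }

Each agent instance holds the state for a single source, so analysis is spread across many small, stateful actors rather than funnelled through one central query.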
Posted almost 4 years ago by Simon Crosby
Analyzing data on the fly is tricky: data sets are unbounded, and real-time responses demand fast analysis. Incremental algorithms can be used for statistical analysis, set membership, regression-based learning, and the online training and prediction of learning models, amongst others. In specific use cases, domain-specific algorithms also apply.
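
As one concrete instance of the incremental algorithms mentioned above (not taken from the post), the following Java sketch maintains a running mean and variance over an unbounded stream using Welford's method, folding each event into the statistics in O(1) time and constant memory.

    // Welford's online algorithm: running mean and variance without storing the stream.
    public class RunningStats {
      private long count = 0;
      private double mean = 0.0;
      private double m2 = 0.0; // sum of squared deviations from the current mean

      public void add(double x) {
        count++;
        double delta = x - mean;
        mean += delta / count;
        m2 += delta * (x - mean); // uses the updated mean
      }

      public double mean() { return mean; }

      public double variance() {
        return count > 1 ? m2 / (count - 1) : 0.0; // sample variance
      }

      public static void main(String[] args) {
        RunningStats stats = new RunningStats();
        for (double reading : new double[] {4.2, 5.1, 3.9, 6.0}) {
          stats.add(reading); // one O(1) update per event in the stream
        }
        System.out.printf("mean=%.3f variance=%.3f%n", stats.mean(), stats.variance());
      }
    }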
Posted almost 4 years ago by Simon Crosby
Streaming data contains events that are updates to the states of applications, devices, or infrastructure. When choosing an architecture to process events, the role of the broker, such as Apache Kafka or Pulsar, is crucial: it has to scale and meet application performance needs, but it is necessarily limited to the data domain. Even using a stream-processing capability such as Kafka Streams, which triggers computation for events that match rules, leaves an enormous amount of complexity for the developer to manage, and all of it comes down to understanding the state of the system. That is why you care about the state of the system and not its raw data.
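
To ground the point about state versus raw data, here is a hedged Kafka Streams sketch, not from the post itself, that reduces a hypothetical "device-events" topic into a continuously updated table holding the latest value per device key; downstream logic then works against that derived state instead of replaying raw events.

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class LatestStatePerDevice {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "latest-state-per-device");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Raw events, assumed to be keyed by device id (illustrative topic and schema).
        KStream<String, String> events = builder.stream("device-events");

        // Derived state: the latest event per device, kept up to date as events arrive.
        KTable<String, String> latestByDevice = events
            .groupByKey()
            .reduce((previous, current) -> current);

        // Log each state change; a real application would query or act on this state.
        latestByDevice.toStream()
            .foreach((deviceId, state) -> System.out.println(deviceId + " -> " + state));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
      }
    }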
Posted about 4 years ago by Simon Crosby
The rise of event streaming as a new class of enterprise data that demands continuous analysis is uncontroversial. What's puzzling are the approaches the event streaming community is taking to storing event data, and the semantics it seeks to achieve for event processing applications.
Posted about 4 years ago by Simon Crosby
One of the most frequently discussed features of 5G, beyond increased bandwidth, is the opportunity for carriers to deliver secure, isolated, QoS-assured network services. A customer can avoid the complexity of a dedicated network or ensure that traffic from mobile devices in the hands of employees is cleanly isolated from the Internet. So-called "slicing" reduces the risk of cyber-attacks and supports the growth of a distributed workforce.