Find products trusted by professionals in your network
Software used to process data continuously as events occur.
- Manage, store, and analyze streaming data
- Visualize data flow and validate data and its delivery
- Extend the reach of existing enterprise assets using connectors to multiple core systems
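At its simplest, processing data continuously as events occur means consuming an unbounded stream and updating state as each event arrives, rather than batching. A minimal sketch in plain Python (the event shape and field names are invented for illustration, not any vendor's API):

```python
from collections import defaultdict

def process_stream(events):
    """Consume events one at a time, maintaining a running count per event type."""
    counts = defaultdict(int)
    for event in events:
        counts[event["type"]] += 1
        yield dict(counts)  # emit an updated snapshot after every event

events = [{"type": "click"}, {"type": "view"}, {"type": "click"}]
snapshots = list(process_stream(events))
# The last snapshot reflects all events seen so far: {'click': 2, 'view': 1}
```

Real stream processors add windowing, fault tolerance, and distributed state on top of this basic loop.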
26 results
Gain immediate analytic insights from real-time data streaming into your organization with SAS Event Stream Processing.
Cribl Edge provides the flexibility and control administrators love from Stream, now running at the edge. Combined with automatic log discovery and metrics production, Cribl Edge empowers developers and operations teams to discover relevant telemetry hidden in unknown and legacy applications that have limited tooling.
Turn big data into better data with Cribl Stream. Cribl Stream is a vendor-agnostic observability pipeline that gives you the flexibility to collect, reduce, enrich, normalize, and route data from any source to any destination within your existing data infrastructure. You’ll finally achieve full control of your data and can choose how to treat it to best support your business goals.
The StreamSets platform provides modern data integration capabilities that deliver analytics-ready data through resilient and repeatable pipelines. The StreamSets platform eliminates data integration friction, enables innovation with centralized guardrails, and insulates data pipelines from unexpected change, allowing teams to unlock data - without ceding control - to enable a data-driven enterprise.
Real-time data integration built for the cloud. The Striim platform makes it easy to ingest, process, and deliver real-time data across diverse environments, in the cloud or on-premises. Striim can ingest real-time data from a variety of sources, including change data from enterprise databases, and rapidly deliver it to on-premises or cloud systems. While the data is moving, it’s easy to process it with simple SQL-based transformations to get it into the correct form for the target. Striim provides enterprise-grade real-time integration in a scalable, reliable, and secure platform, with real-time monitoring and alerts that give visibility into your continuous data pipelines. Businesses use Striim’s capabilities to migrate from on-premises databases to cloud environments without downtime, keep cloud-based analytics environments up to date, and feed real-time data to their on-premises and cloud data lakes and messaging systems for timely operational decision making.
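A SQL-based transformation over in-flight change data amounts to filtering and projecting each change event as it passes through. A rough Python approximation (the operation codes and column names are invented for illustration, not Striim's syntax):

```python
def transform(change_events):
    """Approximates: SELECT id, UPPER(name) AS name FROM stream WHERE op = 'INSERT'."""
    for ev in change_events:
        if ev["op"] == "INSERT":                                 # WHERE clause
            yield {"id": ev["id"], "name": ev["name"].upper()}   # SELECT projection

cdc_stream = [
    {"op": "INSERT", "id": 1, "name": "alice"},
    {"op": "UPDATE", "id": 1, "name": "alicia"},
    {"op": "INSERT", "id": 2, "name": "bob"},
]
rows = list(transform(cdc_stream))
# Only the two INSERT events pass through, with names upper-cased.
```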
Hazelcast Platform is a unified real-time data platform that enables companies to act instantly on streaming data. It combines high-performance stream processing capabilities with an enterprise-grade cache to let you easily enrich streaming data with historical context to continuously uncover hidden actionable insights. This unified architecture reduces the number of moving parts in your infrastructure and removes much of the complexity of developing and deploying sophisticated business-critical applications.
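The enrich-streaming-data-with-historical-context pattern described here can be sketched generically: a fast cache keyed by entity id is joined against each live event as it arrives. In this sketch a plain dict stands in for the enterprise-grade cache, and the field names are assumptions:

```python
# Historical context held in a cache keyed by customer id (stand-in for a real cache).
customer_cache = {
    "c1": {"segment": "enterprise", "lifetime_value": 120_000},
    "c2": {"segment": "smb", "lifetime_value": 4_500},
}

def enrich(stream):
    """Join each live event with cached historical context for its customer."""
    for event in stream:
        context = customer_cache.get(event["customer_id"], {})
        yield {**event, **context}

live_events = [{"customer_id": "c1", "amount": 250}]
enriched = list(enrich(live_events))
# enriched[0] now carries both the live amount and the cached segment/value.
```

Keeping the lookup in memory is what makes the join cheap enough to run per event at streaming rates.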
Managed Service for Apache Kafka®. Apache Kafka® is the leading data streaming technology for large-scale, data-intensive applications. Streamline your Apache Kafka cluster deployment in the cloud and get a production-ready, fully managed service in about 10 minutes. Get more than just Kafka with DoubleCloud:
- Enhanced monitoring: detailed metrics on topics and replication, alongside real-time infrastructure stats such as CPU, memory, and network usage.
- MirrorMaker and S3 sink Kafka connectors: easily replicate data between clusters using Kafka Connect with MirrorMaker.
- x86 and ARM support: get a 20% price/performance boost by running Apache Kafka on Arm-based instances on AWS with no extra work.
- Built-in integrations: use Data Transfer, which comes with over 20 connectors, to move data from various sources into Apache Kafka and vice versa.
- Schema registry and REST Proxy support.
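Kafka's core abstraction underlying all of the above is a topic split into partitions, where records with the same key always land in the same partition and therefore stay ordered. A toy in-memory model of that idea (a concept sketch, not any Kafka client's API):

```python
class ToyTopic:
    """Minimal model of a Kafka topic: same key -> same partition -> per-key ordering."""

    def __init__(self, num_partitions=3):
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Kafka's default partitioner hashes the record key to pick a partition.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p

topic = ToyTopic()
p1 = topic.produce("order-42", "created")
p2 = topic.produce("order-42", "paid")
# Both records for "order-42" land in the same partition, in produce order.
```

Partitioning is also what lets Kafka scale consumption: each partition can be read independently by a different consumer in a group.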
PortX provides complete control of high-performance observability data pipelines from multiple sources at any scale through a single, seamless experience, with freedom from vendor lock-in and controlled routing that sends only the data each destination needs. PortX includes AI/ML-driven log-data collection, automated parsing, pattern detection, aggregation, advanced filtering, masking and encryption, and enrichment. PortX significantly reduces logging costs.
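The filtering and masking operations mentioned above can be sketched generically: a filter decides which records to forward, and a mask redacts sensitive values before they leave the pipeline. The regex, field names, and severity rule below are illustrative assumptions, not PortX configuration:

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask(record):
    """Redact email addresses anywhere in the log message."""
    return {**record, "message": EMAIL.sub("<redacted>", record["message"])}

def keep(record):
    """Filter: forward only warning-and-above records (illustrative rule)."""
    return record["level"] in {"warn", "error"}

logs = [
    {"level": "info", "message": "user logged in"},
    {"level": "error", "message": "payment failed for bob@example.com"},
]
forwarded = [mask(r) for r in logs if keep(r)]
# Only the error record is forwarded, with the email address redacted.
```

Filtering before forwarding is where the logging-cost reduction comes from: dropped records never reach the paid destination.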