Roughly 85% of security breaches are conducted with compromised credentials, often at the administrator level or higher. Many IT groups assume “security” means authentication, authorisation and encryption (AAE), but these are often tick-box measures that rarely stop breaches. The internal threat surfaces of data streams or disk drives in a RAID set in a data centre are not the threat surface of interest; the credentials used to reach them are.
Cyber or Threat Intelligence organisations must conduct internal investigations of IT, subcontractors and supply chains without implicating the innocent, so they are organisationally air-gapped from IT. Some surveys indicate that up to 10% of IT staff are under investigation at any given time.
Deploying a signal processing platform such as Confluent Enterprise lets organisations evaluate data as soon as it becomes available, so they can assess and mitigate risk before it materialises. In Cyber or Threat Intelligence, events can be treated as signals, and when analysts hunt for threat actors, those actors don't appear as a single needle in a haystack but as a series of needles. In this paradigm, streams of signals aggregate into signatures. This session shows how various sub-systems in Apache Kafka can be used to aggregate, integrate and attribute these signals into signatures of interest.
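To make the "signals aggregate into signatures" idea concrete, here is a minimal, Kafka-free sketch in plain Python. The event shape, actor names, window size and threshold are all invented for illustration; a real deployment would express the same grouping as a windowed aggregation in Kafka Streams or ksqlDB rather than an in-memory dictionary.

```python
from collections import defaultdict

WINDOW_SECONDS = 3600      # aggregation window (assumed value)
SIGNATURE_THRESHOLD = 3    # distinct signal types that form a signature (assumed)

def aggregate_signatures(events):
    """Group (timestamp, actor, signal_type) events into per-actor,
    per-window sets of signal types; return the actors whose windows
    accumulate enough distinct signals to count as a signature."""
    windows = defaultdict(set)
    for ts, actor, signal_type in events:
        bucket = ts // WINDOW_SECONDS          # coarse time-window key
        windows[(actor, bucket)].add(signal_type)
    return {actor for (actor, _), signals in windows.items()
            if len(signals) >= SIGNATURE_THRESHOLD}

# Hypothetical signal stream: three related needles for one service account,
# one isolated needle for an analyst.
events = [
    (100, "svc-account-7", "off-hours-login"),
    (200, "svc-account-7", "privilege-escalation"),
    (300, "svc-account-7", "bulk-data-read"),
    (150, "analyst-42", "off-hours-login"),
]
print(aggregate_signatures(events))  # -> {'svc-account-7'}
```

The point of the sketch is the shift in unit of analysis: no single event is alarming on its own, but the windowed aggregate of distinct signal types for one actor crosses a threshold and becomes a signature worth attributing.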
In this Online Talk you will learn:
Jeffrey Needham, Confluent
Jeffrey Needham works in the Advanced Technologies Group of Confluent, specializing in sensor/analytic fabrics and computational event streaming platforms. His background spans Accumulo, HDFS, storage appliances, continuous availability, the Oracle RAC kernel and HPC compilers. Jeffrey primarily supports Confluent's US Federal Civilian, DoD and IC teams.