Big Data Analytics and Machine Learning
Our expertise is born out of a passion for innovation
and decades of hands-on experience designing and building
systems for some of the most demanding customers in the world including the
National Security Agency (NSA) and the New York Stock Exchange (NYSE).
Our team has worked extensively with major vendors to build custom solutions and has also
built high-performance proprietary systems from the ground up.
We know the major vendors’ strengths and weaknesses.
Absent specific guidance and requirements, we take a best-of-breed approach when designing a system.
Big Data
Big Data is often characterized by the three V’s:
high Volume, high Variety, and high Velocity.
The Internet of Things is adding a fourth dimension we have termed
“high Vector” to account for the increasing number of sensors, devices and access
points.
Some examples of Big Data include network traffic, market data,
social media and commerce transactions. The challenges include capture,
storage, search, sharing, control, monitoring, analysis and visualization.
The N2 team has extensive experience designing and developing highly
specialized Big Data systems as well as integrated solutions leveraging popular
commercial products and technologies.
The Big Data product landscape is vast and can be overwhelming. Our general approach is to
leverage best-of-breed technologies for the customer's specific objectives
and develop custom components when no other alternative exists.
Advanced Analytics
Advanced analytics enables the discovery of
patterns and trends in structured and unstructured data,
and uses this insight to predict the outcomes of future events and interactions.
Analytics is typically performed by algorithms and machines, whereas analysis is performed by humans.
In practice, this involves extensive use of mathematics and statistics,
including descriptive techniques, predictive modeling, machine learning,
and neural networks, to gain valuable knowledge from data.
The spectrum of data being analyzed spans from highly structured relational data
to unstructured data, and may be at rest or streaming.
Analytics is diverse and may be used to predict the
likelihood of events based on changes in social networks and conversations,
a threat level based on correlated cybersecurity events across systems and networks,
or the probability of making money in the stock market.
Cloud Computing
As defined by the National Institute of Standards and Technology (NIST),
Cloud Computing is a model for enabling ubiquitous,
convenient, on-demand network access to a shared pool of configurable computing resources
(e.g., networks, servers, storage, applications, and services)
that can be rapidly provisioned and released with minimal management
effort or service provider interaction.
The N2 team has architected and deployed public,
private and hybrid clouds, as well as managed and supported mission-critical applications
and services in the cloud.
The N2 team has worked extensively with all major public cloud providers:
Amazon Web Services (AWS), Google Cloud Platform, Microsoft Azure, and Rackspace.
Distributed Systems
The ability to divide and conquer is paramount in the processing of Big Data.
This is generally accomplished through sharding and the use of distributed
systems that leverage many computers operating in parallel.
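Sharding can be as simple as hashing a record’s key to decide which node owns it. The sketch below is illustrative (the key format and shard count are hypothetical); it uses a stable cryptographic hash so every node computes the same owner without coordination.

```python
import hashlib

def shard_for(key: str, num_shards: int) -> int:
    """Map a record key to a shard deterministically via a stable hash."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

# Any machine, in any process, resolves the same key to the same shard.
print(shard_for("user:42", 8))
```

Note the use of hashlib rather than Python’s built-in hash(), which is salted per process and would give different answers on different machines.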
A distributed system is a software system in which components located on networked computers
communicate and coordinate their actions by passing messages.
The distributed components interact with each other in order to achieve a common goal.
Distributed computing also refers to the use of distributed systems to solve computational problems.
In distributed computing, a problem is divided into many tasks,
each of which is solved by one or more computers,
which communicate with each other by message passing.
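The divide-into-tasks model above can be sketched in a single process, with a worker pool standing in for networked machines. This is a deliberate simplification: real distributed systems pass messages over a network and must handle partial failure, which this toy example does not.

```python
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    """One worker task: process its shard of the data independently."""
    return sum(chunk)

def distributed_sum(data, num_workers=4):
    """Split the problem into tasks, run them in parallel, combine the results."""
    size = max(1, len(data) // num_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        partials = pool.map(sum_chunk, chunks)  # partial results arrive like messages
    return sum(partials)

print(distributed_sum(list(range(1, 101))))  # 5050
```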
Distributed systems introduce challenges related to process orchestration,
concurrent computing and task execution, and data locality.
The N2 team has developed frameworks that simplify the programming of distributed processes and concurrent execution.
An integral part of monitoring and managing execution is
status and metric reporting, with control mechanisms to optimize performance and availability.
Real-Time Computing
Real-time computing (RTC), or reactive computing, describes
hardware and software systems that are subject to a
“real-time constraint”, for example an operational
deadline from event to system response. Real-time programs must
guarantee a response within strict time constraints, often referred
to as “deadlines”.
Real-time responses are
often understood to be on the order of milliseconds, and
sometimes microseconds. Conversely, a system without real-time
facilities cannot guarantee a response within any timeframe
(regardless of actual or expected response times). A real-time
system controls an environment by receiving data,
processing it, and returning the results quickly enough
to affect the environment at that time. A real-time system is
often one whose application is, within its context,
mission critical. High Frequency Trading (HFT) systems are an
example of real-time computing: the real-time
constraint is the time within which a trade must
be executed to prevent a loss or make a profit. A real-time
computation can be said to have failed if it is not completed
before its deadline, where the deadline is relative to an
event. A real-time deadline must be met regardless of system
load. An Active Cyber Defense (ACD) system is another example of
a real-time system; it detects Intrusion Detection System (IDS) and
Intrusion Prevention System (IPS) events and responds
to mitigate or control the attacker’s activities within a
certain timeframe.
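In software, a deadline can be checked by measuring elapsed time against a fixed budget, as in the sketch below. The 10 ms budget and handler are illustrative, and this only detects a missed deadline; a hard real-time guarantee requires operating-system and hardware support.

```python
import time

DEADLINE_SECONDS = 0.010  # illustrative 10 ms budget

def handle_event(event, handler):
    """Run a handler and report whether it completed within its deadline."""
    start = time.monotonic()          # monotonic clock: immune to wall-clock changes
    result = handler(event)
    elapsed = time.monotonic() - start
    met_deadline = elapsed <= DEADLINE_SECONDS
    return result, met_deadline

result, on_time = handle_event(5, lambda e: e * 2)
print(result, on_time)
```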
Microservices and Serverless
Applications built around microservices and serverless functions are replacing monolithic
architectures, with loose coupling, greater modularity, portability and reduced dependencies
in mind. This enables organizations to be more agile and adapt faster at Web scale.
With decades of experience, our approach is to architect systems
that scale horizontally with embedded control, monitoring and analytics. We implement these systems
based on reactive design patterns, using frameworks that are integral to the whole system rather than afterthoughts.
Using a concept N2 has coined “Applastic Computing”,
these applications are in control of their own scaling and can run in public,
private or hybrid cloud environments.
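The essence of a microservice is a small, independently deployable process behind a network API. The sketch below is a minimal, standard-library-only illustration: a service exposing a health endpoint and a client probing it, the kind of check a scaler or load balancer relies on. It is not one of our frameworks; production services would add routing, metrics, and a proper runtime.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """A single-responsibility endpoint: report this service's health."""
    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):  # silence per-request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), HealthHandler)  # port 0 picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

def check_health():
    """Client side: a peer service (or scaler) probing this instance over HTTP."""
    url = f"http://127.0.0.1:{server.server_port}/health"
    with urllib.request.urlopen(url) as resp:
        return resp.status, json.loads(resp.read())

print(check_health())  # (200, {'status': 'ok'})
```

Because the service is stateless, any number of identical copies can run behind a load balancer, which is what makes horizontal scaling straightforward.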