The article “Explosion of flows: what power is needed to supervise them tomorrow?” showed that the parallel processing built into modern hardware (network cards and processors) offers a relevant answer to the race for ever-higher throughput. But it left open the question of how to preserve the uniqueness of sessions during this parallelization.

Network complexity and encapsulation

The first problem a network must solve is fast, efficient routing. To better exploit today's speeds, operators (telcos, ISPs, data centers, etc.) deploy routers with new capabilities such as tunnelling. Optimized routing enables greater network convergence and virtualization, and improves both the quality of service and the security of networks by logically separating them into several entities.

Tunnelling inserts additional protocols to simplify network management: the original data is encapsulated inside new routing layers, as sketched below.
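To make the stacking concrete, here is a minimal sketch using the scapy library. The addresses and the choice of a GRE tunnel are assumptions for illustration, not a description of any particular operator's setup:

```python
# Minimal sketch of tunnel encapsulation with scapy (illustrative
# addresses; real deployments use operator-assigned values).
from scapy.all import IP, TCP, GRE

# Inner packet: the user's actual TCP session.
inner = IP(src="10.0.0.1", dst="10.0.0.2") / TCP(sport=40000, dport=443)

# Outer packet: the operator's GRE tunnel between two routers.
# The user's IP layer is now mere payload for the tunnel endpoints.
outer = IP(src="192.0.2.1", dst="192.0.2.2") / GRE() / inner

outer.show()  # prints IP / GRE / IP / TCP: two stacked routing layers
```

A probe that starts its analysis at the first IP header it finds sees only the tunnel endpoints, not the user's session underneath.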

What about monitoring quality of service and security?

Designed before tunnelling became widespread, the protocol analysis probes used for network monitoring only process simple protocol sequences starting at the IP layer; they had no particular reason to deal with low-level protocols.

For these conventional probes, the difficulties appear when processing the protocols located below IP. The explosion of tunnelling makes it hard to preserve the uniqueness of sessions once processing is parallelized. To get around this, conventional probes adopted software and hardware workarounds that jump straight to the IP layer, bypassing the tunnels. The direct consequence is that they are blind to Layer 2 protocols and risk missing crucial information.
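By way of illustration, one way to keep parallel processing without losing session uniqueness is to dispatch packets on a hash of the inner 5-tuple rather than the outer tunnel headers. The sketch below is a simplified assumption of such a dispatcher, not any vendor's actual implementation:

```python
# Hedged sketch: dispatching packets to workers while preserving session
# uniqueness. A symmetric hash over the *inner* 5-tuple ensures that both
# directions of a tunnelled session reach the same worker; hashing the
# outer tunnel headers would not. Field names are illustrative.
import hashlib

def worker_for(inner_src, inner_dst, sport, dport, proto, n_workers):
    # Sort the endpoints so A->B and B->A hash identically (symmetric hash).
    a, b = sorted([(inner_src, sport), (inner_dst, dport)])
    key = f"{a}|{b}|{proto}".encode()
    return int.from_bytes(hashlib.sha1(key).digest()[:4], "big") % n_workers

# Both directions of the same inner session land on the same worker:
assert worker_for("10.0.0.1", "10.0.0.2", 40000, 443, 6, 8) == \
       worker_for("10.0.0.2", "10.0.0.1", 443, 40000, 6, 8)
```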

Skipping analysis at the tunnelling level: what are the risks?

Being able to analyze low-level protocols, and routing protocols in particular, is essential to verify the quality of data in transit, the correct configuration of equipment and, more generally, the overall health of the network.

Today, quality of service and security cannot be fully guaranteed without also supervising the low-level protocols exchanged between a network's devices. Ignoring them leaves the network at the mercy of the initial configuration of its equipment, which becomes the only control in place. If the devices were misconfigured, or have been corrupted by an attack or an unfortunate update, the malfunction only becomes apparent once its consequences on the network are visible.
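As an illustration of what supervising these exchanges buys, the sketch below compares a hypothetical baseline configuration against what a probe actually observes on the wire. The data structures and field names are assumptions made for the example:

```python
# Hedged sketch: flagging configuration drift from observed control-plane
# traffic. `expected` and `observed` stand in for what a probe would build
# from decoded low-level announcements (e.g. routing hellos).
expected = {"router-A": {"area": "0.0.0.0", "neighbors": {"router-B"}}}
observed = {"router-A": {"area": "0.0.0.1", "neighbors": {"router-B", "rogue-X"}}}

def config_drift(expected, observed):
    alerts = []
    for device, conf in observed.items():
        baseline = expected.get(device)
        if baseline is None:
            alerts.append(f"{device}: unknown device announcing itself")
            continue
        for field, value in conf.items():
            if baseline.get(field) != value:
                alerts.append(f"{device}: {field} is {value}, expected {baseline.get(field)}")
    return alerts

for alert in config_drift(expected, observed):
    print(alert)  # reports the wrong area and the unexpected neighbor
```

Without visibility below IP, none of these announcements ever reach the monitoring layer, and the drift goes unnoticed until the network misbehaves.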

Handling protocol analysis only from IP upward, without looking at what happens below Layer 3, has become an error with serious consequences: it deprives the operator both of visibility into encapsulation (a loss of granularity) and of the protocols dedicated to network management.

The “new generation” probes: essential allies for effective supervision

Since trust does not exclude control, it is essential to verify that networks are correctly configured. With probes able to exploit low-level protocols, it becomes possible to build a real map of the network's routing, rather than a theoretical map derived from the machines' configurations. In the event of an anomaly, for example, such a probe can pinpoint exactly which piece of equipment is failing, as sketched below.
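A simplified illustration, under the assumption that the probe can already decode low-level adjacency exchanges into (device, peer) pairs:

```python
# Hedged sketch: building an *observed* adjacency map from decoded
# low-level exchanges and comparing it with the theoretical map derived
# from device configurations. The (device, peer) pairs are hypothetical
# inputs a probe would extract from the wire.
from collections import defaultdict

theoretical = {"R1": {"R2", "R3"}, "R2": {"R1", "R3"}, "R3": {"R1", "R2"}}

observed_pairs = [("R1", "R2"), ("R2", "R1"), ("R2", "R3")]  # R3 has gone silent

observed = defaultdict(set)
for device, peer in observed_pairs:
    observed[device].add(peer)

for device, expected_peers in theoretical.items():
    missing = expected_peers - observed[device]
    if missing:
        print(f"{device}: no traffic seen toward {sorted(missing)}")
```

Here the output singles out R3, which announces nothing toward its expected peers: the anomaly is localized to a specific device rather than inferred from downstream symptoms.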

It is with this in mind that the NANO Corp teams have designed a new generation of probes that can be deployed both at the core and at the edge of networks. Unlike existing market solutions, these probes do not work around the problems described above; they address them head-on, as an expert response to what is actually at stake.
