While the cost of 10GbE is coming down and adoption is rising rapidly, challenges remain in analyzing 10GbE traffic, most notably because the industry has yet to achieve real-time analysis at 10GbE line rates. However, 10GbE analysis is available today, and it does not have to limit your results. Below are six questions that will help determine whether your organization is fully optimized for analyzing 10GbE traffic.
1. Are you being specific enough?
It's important to know exactly what you want to capture and which information will be most beneficial for your analysis. Your requirements will likely vary from one network segment to the next, and odds are you will need to capture data at several locations. An excellent way to analyze 10GbE traffic, especially when utilization is high, is to save the data to disk in real time and perform the analysis post-capture. Trying to capture and analyze simultaneously, in real time, on highly utilized network segments puts far more strain on the system than simply writing data to disk for post-capture analysis.
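Here is a minimal sketch of that two-phase workflow in Python, assuming the scapy library (the interface name, file path, and packet count are placeholders; scapy itself won't sustain 10GbE line rates, but the capture-now, analyze-later pattern is the point):

```python
from scapy.all import PcapReader, PcapWriter, TCP, sniff

CAPTURE_FILE = "segment_a.pcap"          # hypothetical output path

# Phase 1: real-time capture, disk writes only, no per-packet analysis.
writer = PcapWriter(CAPTURE_FILE, append=True, sync=False)
sniff(iface="eth0", prn=writer.write, store=False, count=100_000)
writer.close()

# Phase 2: post-capture analysis on the saved file, at our own pace.
tcp_packets = 0
for pkt in PcapReader(CAPTURE_FILE):
    if pkt.haslayer(TCP):
        tcp_packets += 1
print(f"TCP packets captured: {tcp_packets}")
```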
2. Do you REALLY know your network?
Knowing how you expect the network to perform becomes all the more critical when trying to analyze highly utilized 10GbE segments. If you're already embroiled in a complex network analysis firefight, it's too late to discover that your ability to assess "normal" conditions on the network is lacking. To get a sense of "normal" before trouble arises, perform and archive baseline measurements of specific network traffic, like HTTP and key business applications, over typical cycles - an hour, a day, a week - for the network as a whole. Other metrics to consider include packet-size distribution as well as protocol and node usage over time; uncovering the cycles in these metrics provides a "fingerprint" of your utilization. That way you will always have a clear view of the network for comparison when trouble arises. Only after convincing yourself that this basic data is in place, collected, and analyzed should you embark on detailed analysis and drill-down of packet-level data.
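As a rough illustration, a baseline "fingerprint" can be computed from an archived capture. The sketch below assumes scapy and an entirely hypothetical file name; the bucket size and choice of metrics are illustrative, not prescriptive:

```python
from collections import Counter, defaultdict
from scapy.all import PcapReader, TCP, UDP

size_buckets = Counter()                 # packet-size distribution
proto_bytes = defaultdict(Counter)       # bytes per protocol, per minute

for pkt in PcapReader("baseline_week.pcap"):     # hypothetical archive
    size_buckets[(len(pkt) // 256) * 256] += 1   # 256-byte size buckets
    minute = int(pkt.time) // 60                 # capture timestamp, minutes
    if pkt.haslayer(TCP):
        proto_bytes[minute]["TCP"] += len(pkt)
    elif pkt.haslayer(UDP):
        proto_bytes[minute]["UDP"] += len(pkt)

print("packet-size distribution:", dict(size_buckets))
print("minutes of traffic sampled:", len(proto_bytes))
```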
3. Are you sticking to the essentials?
The temptation is to try to capture and analyze everything, especially when the source of the problem is not immediately known. But quite often certain conditions can be ruled out right away, and using these clues to limit collection and analysis to only what is necessary dramatically improves network analysis performance. You always have the option to customize the analysis by turning off modules that are not relevant to the current exercise. Modules such as wireless network performance can usually be disabled in 10GbE analysis, because odds are they have no bearing on the problem being investigated. The key is to take advantage of this customization rather than leaving every module on by default.
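In code form, the idea looks something like the registry sketch below (the module names and functions are entirely hypothetical; the point is that disabled modules never enter the per-packet hot path):

```python
def analyze_tcp_health(pkt):
    pass  # placeholder: real logic would track retransmissions, RTT, etc.

def analyze_wireless(pkt):
    pass  # placeholder: irrelevant on a wired 10GbE segment

ALL_MODULES = {
    "tcp_health": analyze_tcp_health,
    "wireless": analyze_wireless,
}

enabled = ["tcp_health"]                 # wireless switched off for this job

def per_packet(pkt):
    for name in enabled:                 # only enabled modules run per packet
        ALL_MODULES[name](pkt)
```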
4. Do you know your limits?
Even after analysis has been streamlined to the essential areas of the network, data capture on 10GbE networks generates a great deal of data very quickly, and managing that data becomes a significant challenge. Regardless of the system used, the data is typically stored for subsequent retrieval and post-capture analysis, most commonly as standard packet files or in a database. In either case, two metrics to manage closely are file size and frequency of disk writes.

Though intuition may suggest that larger files are better, this is often not the case: very large files require very large memory footprints to open, and if the files are too large they will be unworkable on the computer used for analysis. Smaller files, however, lead to more frequent disk writes, which can rob the system of precious resources for performing the actual packet capture. Optimum performance is a balance of these two demands, and the balance point depends on the hardware resources available. One rule of thumb to keep in mind: if files are being created every 30 seconds or less, sustaining the maximum packet capture rate becomes significantly harder.

Starting with reasonably sized buffers and files makes all the difference. We recommend starting with a 256MB buffer for packet capture and 128MB for the files being created; after a few captures you'll quickly determine whether either parameter can be better tuned for your system. Also, use as few simultaneous captures as possible. Many systems let you create as many captures as you want, but remember that each open capture reserves more memory for buffering, leaving less available for data processing.
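To see how file size and write frequency interact, here is a quick back-of-the-envelope calculation (my own arithmetic, not vendor guidance): how often a capture file of a given size rotates at a given sustained capture rate.

```python
def rotation_interval_seconds(file_size_mb: float, rate_gbps: float) -> float:
    """Seconds until a capture file of file_size_mb fills at rate_gbps."""
    bytes_per_second = rate_gbps * 1e9 / 8        # Gb/s -> bytes/s
    return file_size_mb * 1e6 / bytes_per_second

for rate in (0.5, 1.0, 2.5):                      # sustained capture rates, Gb/s
    t = rotation_interval_seconds(128, rate)      # the 128MB starting point
    print(f"{rate} Gbps -> new 128 MB file every {t:.1f} s")
```

Note that at sustained gigabit-per-second capture rates, the 128MB starting point rotates far more often than every 30 seconds - a clear signal to grow the file size once you've confirmed the analysis machine can comfortably open the larger files.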
5. Are you filtering and slicing?
Filtering limits the overall number of packets captured and stored based on user-specified criteria. Slicing captures and stores all of the packets but truncates each one after a certain length, typically letting you keep the header information while slicing off the payload. In both cases the result is the same: the overall amount of data to store is significantly reduced, freeing up more processing power for capture and analysis and more disk space for the data that's truly important to the current analysis task.
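Both techniques can be combined, as in the sketch below (scapy assumed again; the interface, port, and file name are placeholders, and capture filters require libpcap and appropriate privileges):

```python
from scapy.all import Ether, PcapWriter, raw, sniff

SNAP_LEN = 128                            # bytes kept per packet
writer = PcapWriter("sliced_http.pcap")   # hypothetical output file

def keep_headers(pkt):
    # Slicing: truncate each packet before it is stored.
    writer.write(Ether(raw(pkt)[:SNAP_LEN]))

# Filtering: the BPF expression limits capture to HTTP traffic only.
sniff(iface="eth0", filter="tcp port 80", prn=keep_headers,
      store=False, count=10_000)
writer.close()
```

Keeping 128 bytes per packet preserves the Ethernet, IP, and TCP headers for most traffic while discarding the bulk of the payload.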
6. Are you being reasonable?
Most network analysis systems allow multiple users to connect to the hardware performing critical network data capture and analysis tasks. Put a limit on those users. Nominate an owner for each system who will monitor filters and captures, and make sure it's understood who has the authority to kill a capture. Too many users with too many options is a recipe for disaster. You can always scale out with additional systems if needed.