Blind Spots in Data Center Monitoring

What Is a Data Center and Why Does It Matter?

A data center is a facility where vast amounts of data and information are stored, managed and disseminated on racks of computer servers. Data centers are huge information repositories, both virtually and literally.

An advanced data center can occupy a large land area to house its various facilities, and managing one is both time- and energy-consuming.

To save space, servers are made thin and stacked in racks. These machines draw a great deal of power, so they require constant, reliable cooling to make sure they don't overheat.

Organizations need data centers to keep their operations systematic and functional. For instance, when you connect to the internet and go to a website, you are actually connecting to a data center. Google and Facebook operate some of the largest data centers in the world to ensure the reliability of their services.

Because of its critical role in delivering uninterrupted, reliable services, a data center requires 24/7 monitoring and maintenance.

What Is Data Center Monitoring?

Data Center Monitoring ensures the health of a data center. It is simply the process of monitoring, managing and operating the data center. The purpose of monitoring a data center is to ensure that the pertinent processes and functions are working effectively and without interruption.

To manage and monitor these data monsters, the industry developed Data Center Infrastructure Management, or DCIM.

DCIM is a set of tools for efficiently and effectively monitoring and managing a data center. It can automatically collect real-time energy usage, accurately display overall trends, and zoom into the details of a data center's health.

Simply put, DCIM acts as an all-seeing eye into even the minutest details of a data center's status and performance.
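As a rough illustration of the kind of metric a DCIM tool tracks, the short Python sketch below computes Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power. The meter values and function name are hypothetical and are not taken from any particular DCIM product.

```python
# Minimal sketch: computing Power Usage Effectiveness (PUE) from meter readings.
# PUE = total facility power / IT equipment power; the readings below are hypothetical.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio; 1.0 is the theoretical ideal."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

if __name__ == "__main__":
    # Hypothetical readings: 1,800 kW at the utility meter, 1,200 kW at the IT racks.
    ratio = pue(total_facility_kw=1800.0, it_equipment_kw=1200.0)
    print(f"PUE: {ratio:.2f}")  # -> PUE: 1.50, i.e. 0.5 W of overhead per watt of IT load
```

A real DCIM suite gathers these readings continuously from facility and rack-level meters; the point here is only how the headline metric is derived.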

Most Common Issues in Data Center Monitoring

  1. High Energy Costs. Data centers are voracious energy consumers. An advanced data center facility can draw power comparable to a small town. Could you believe that running U.S. data centers takes the output of some 34 large 500 MW power plants? That is a staggering power bill of roughly $9 billion.
  2. Network Congestion. With the advent of virtual machines (VMs), servers can be overwhelmed by bandwidth consumption. This can result in network errors and can even crash latency-sensitive workloads.
  3. System Failure. This is the nightmare of every data center operator. Even an unattended temperature fluctuation can lead to system failure, so real-time, detailed monitoring of data centers is essential to head off such threats.

3 Blind Spots in Data Center Monitoring

  1. Environmental Monitoring

Designing and monitoring the physical environment of a data center is vital to its optimum performance. The temperature, humidity and airflow inside the data center must be kept under constant surveillance to avoid overheating and other environmental problems that can hamper smooth operation.

Unfortunately, the tools for aggregating server temperature and power data are still lacking, and the resulting gaps can have undesirable effects on the overall performance of the system.

According to Jeff Klaus, a data center infrastructure expert, a focus on new measurements and on improving the aggregation of server temperature and power data is now more imperative than ever.
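The gap Klaus describes is essentially an aggregation problem. The sketch below, in plain Python with made-up sensor readings and thresholds rather than any vendor's API, shows one simple way to roll per-server inlet temperature and power draw up to rack level and flag racks drifting out of range.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-server readings: (rack_id, inlet_temp_c, power_watts).
readings = [
    ("rack-01", 24.5, 310.0),
    ("rack-01", 27.2, 365.0),
    ("rack-02", 31.8, 420.0),
    ("rack-02", 30.4, 398.0),
]

TEMP_LIMIT_C = 27.0    # illustrative inlet-temperature ceiling
POWER_LIMIT_W = 4000   # illustrative per-rack power budget

# Group readings by rack, then compute an average temperature and total power per rack.
by_rack = defaultdict(list)
for rack, temp, watts in readings:
    by_rack[rack].append((temp, watts))

for rack, samples in sorted(by_rack.items()):
    avg_temp = mean(t for t, _ in samples)
    total_power = sum(w for _, w in samples)
    status = "ALERT" if avg_temp > TEMP_LIMIT_C or total_power > POWER_LIMIT_W else "OK"
    print(f"{rack}: avg inlet {avg_temp:.1f} C, total {total_power:.0f} W -> {status}")
```

In production the readings would come from IPMI, SNMP or a DCIM agent rather than a hard-coded list, but the aggregation step looks much the same.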

  2. Security in Virtualization

According to Bernard Golden, “virtualization is an abstracted view of underlying physical devices. This allows multiple physical assets to be combined and presented to servers and applications as if they were a single, larger asset.”

Virtualization is cost-efficient because it reduces the number of physical servers you have to run. In the past, each workload ran on its own dedicated server; with virtualization, administrators can run up to 10 applications on one server, reducing the work needed for system administration.

However, it must not be overlooked that virtualization equally concentrates risk: a single failure or breach can affect all 10 applications at once.

Moreover, virtualized environments remain exposed because security strategies are not keeping pace with virtualization itself. IT pros tend to assume that introducing virtualization doesn't require a new security scheme. That is a big mistake: as more workloads run on a single server, the security measures needed to keep those applications safe must grow with them.

As Gartner's Neil MacDonald put it, when virtualizing, “The first and most important thing to do is to establish basic security hygiene around the configuration and vulnerability management of it.”
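As a simplified, concrete example of the basic hygiene MacDonald calls for, the sketch below audits a hypothetical host inventory for a few obvious misconfigurations: too many VMs packed onto one host, a hypervisor that has not been patched recently, and guests with no security policy assigned. The inventory format, field names and thresholds are assumptions for illustration only, not any hypervisor vendor's API.

```python
from datetime import date

# Hypothetical inventory of virtualization hosts and their guest VMs.
hosts = [
    {
        "name": "hv-host-01",
        "last_patched": date(2018, 1, 15),
        "vms": [
            {"name": "web-01", "security_policy": "dmz"},
            {"name": "db-01", "security_policy": None},  # no policy assigned
        ],
    },
]

MAX_VMS_PER_HOST = 10     # illustrative VM-density limit
MAX_PATCH_AGE_DAYS = 90   # illustrative patch-freshness limit

def audit(host, today=None):
    """Return a list of human-readable hygiene findings for one host."""
    today = today or date.today()
    findings = []
    if len(host["vms"]) > MAX_VMS_PER_HOST:
        findings.append(f"{host['name']}: {len(host['vms'])} VMs exceeds density limit")
    if (today - host["last_patched"]).days > MAX_PATCH_AGE_DAYS:
        findings.append(f"{host['name']}: hypervisor not patched in over {MAX_PATCH_AGE_DAYS} days")
    for vm in host["vms"]:
        if not vm["security_policy"]:
            findings.append(f"{host['name']}/{vm['name']}: no security policy assigned")
    return findings

for h in hosts:
    for finding in audit(h):
        print(finding)
```

The value of a check like this is less in the code than in running it routinely, so that configuration drift on the hypervisor is caught as quickly as drift on the guests.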

  3. Data Smuggling

Organizations are unknowingly bleeding data, blind to the steady drip leaving the data center. One of the many ways data is smuggled out is DNS tunneling. DNS tunneling was once simply a way to get an internet connection by bypassing a captive portal or firewall that blocked unauthorized access. It has since become a technique for extracting data from servers, which is by all means alarming and destructive.

Most organizations leave outbound DNS from the data center open, and we can't blame them: practically every system on the network needs to make DNS calls. As a result, data compromise mostly goes undetected.

As Brian McHenry, a Security Solutions Architect, said:

“…these anomalies are often not detected until they reach the outbound DNS caching resolver. The DNS caching resolver forwards requests on behalf of other systems and serves up cached responses to reduce the outbound DNS traffic volume. The problem here is attribution and tracing the indicators of compromise (IOC) back to the source, as most caching resolvers do not log much detail about which source IP requested what name resolution.”

Worst of all, no existing security suite can prevent data smuggling via DNS tunneling. As R. Langston perfectly puts it, “..next generation firewalls, with all their deep packet inspection and application identification, can’t tell you if there’s data going out via DNS.”
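While no suite prevents it outright, operators can at least watch for the usual tunneling tells: unusually long query names and high-entropy, encoded-looking labels. The sketch below applies those heuristics to a hypothetical list of (client IP, queried name) pairs; the thresholds and log format are assumptions for illustration, not a feature of any product.

```python
import math
from collections import Counter

# Hypothetical outbound DNS log entries: (client_ip, queried_name).
queries = [
    ("10.0.4.21", "www.example.com"),
    ("10.0.4.87", "nb2wy3dpeb3w64tmmqqhg2djorsw4.exfil.example.net"),
]

MAX_NAME_LEN = 60         # illustrative: tunneled names are often very long
MIN_LABEL_ENTROPY = 3.0   # illustrative: encoded payloads look random

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character in the string."""
    counts = Counter(s)
    return -sum((c / len(s)) * math.log2(c / len(s)) for c in counts.values())

for client, name in queries:
    first_label = name.split(".")[0]
    suspicious = (
        len(name) > MAX_NAME_LEN
        or (len(first_label) > 20 and shannon_entropy(first_label) > MIN_LABEL_ENTROPY)
    )
    if suspicious:
        print(f"possible DNS tunneling: {client} -> {name}")
```

Heuristics like these produce false positives (CDNs and some security products also use long, random-looking names), so the flagged queries are a starting point for investigation rather than a verdict, which is exactly the attribution problem McHenry describes.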

Conclusion

There is no better way to address these blind spots than a well-planned, well-built monitoring and management capability for the data center. Streamlined support across facilities, network and IT is imperative to that end.

About the Author, Micah de Jesus

Micah de Jesus is a Digital Marketing Professional and a data/cybersecurity news junkie. She works as the Managing Director of GrowthScout SEO, a digital marketing firm, and as a Contributing Editor for INOC – NOC Center, The Partnership UK, Turrem Data and Solus Multi Factor Authentication.
