Despite significant efficiency gains in the last few years, the data center industry is seeing growth in the rate and severity of datacenter outages, according to the Uptime Institute Global Data Center Survey. Enterprises are looking for new ways to boost their resiliency.
Uptime Institute, the global datacenter authority, surveyed around 900 datacenter operators and IT practitioners worldwide to identify key trends in the data center industry.
Most of the respondents said that a hybrid data center approach made their IT operations more resilient, yet 31% of them still experienced IT downtime or severe service degradation in the past year, up from 25% in last year’s survey. Moreover, nearly half of the respondents experienced a datacenter outage in the last three years, a figure Uptime Institute called higher than expected.
A majority of respondents said the outage was preventable. Most of them recovered fully within one to four hours, while one-third reported a recovery time of at least five hours. The main causes of datacenter outages were human error, configuration mistakes, power outages, network failures, and third-party service provider outages.
The questions that arise are: Why are datacenters vulnerable to outages? Why are those outages so severe? The Uptime Institute survey answers these questions and reviews the main challenges and complexities that data center operators face around the world.
Top challenges and complexities for data center managers
Datacenter efficiency improved but complexity has increased
Power usage effectiveness (PUE), the ratio of total facility energy to the energy delivered to IT equipment, has helped datacenter operators improve infrastructure energy efficiency and, with it, overall datacenter efficiency. Enterprises reported continuous improvement in the PUE metric over the years.
According to the Uptime Institute survey, the average PUE was 2.5 in 2007, which improved to 1.98 in 2011, and then to 1.65 in 2013. It reached 1.58 in 2018. As the use of artificial intelligence (AI)-driven datacenter management as a service (DMaaS) and software-defined power increases, enterprises will continue to improve efficiency to lower operating costs and optimize available power.
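To make the survey figures concrete, the PUE metric is simply total facility power divided by the power consumed by IT equipment. The sketch below illustrates the calculation; the kilowatt figures are invented for illustration and are not from the survey:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt reaches IT equipment; real
    facilities also spend power on cooling, distribution, and lighting.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A hypothetical facility drawing 1,580 kW in total to run a 1,000 kW
# IT load would match the 2018 survey average of 1.58.
print(round(pue(1580, 1000), 2))
```

Lower PUE means less overhead power per watt of IT load, which is why the fall from 2.5 to 1.58 represents a substantial efficiency gain.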
Hybrid IT is becoming mainstream as enterprises aim to place their workloads in the best execution venue considering cost, availability, compliance, and other key factors. However, managing multiple types of datacenter environments increases complexity.
Datacenter managers need to effectively manage a combination of on-premises datacenter capacity and off-premises resources such as colocation, cloud, and hosting. This is driving demand for datacenter infrastructure management (DCIM) software deployments.
To achieve or improve resiliency, some enterprises rely on traditional approaches like regular backups or site replication, while others turn to cloud-based high-availability services and disaster recovery as a service.
Edge computing will add a layer of operational and management complexity
Over 40% of the respondents said that their organization will implement edge computing capabilities so that data can be processed closer to where it is generated and used. Meanwhile, 27% did not expect to need edge computing, and 27% were not sure.
While edge computing drivers such as the internet of things (IoT) are still in their early stages, deployments are already happening in large volumes. Most organizations will need new edge datacenter capacity in the years ahead.
Of the respondents planning new edge capacity, 37% will use their own private datacenters, while 26% will use a combination of colocation and private datacenters. A few respondents also indicated they would outsource or opt for a public cloud provider to handle their edge computing demand.
As more edge applications are introduced, the edge will expand beyond the proximity of organizations’ own private facilities and drive demand for datacenter capacity in multi-tenant facilities, as well as in strategically placed, small, distributed, micro-modular datacenters. This will add a new layer of operations and management complexity in the future.
Average rack densities in data centers remain low
In 2017, 67% of enterprises had an average rack density below 6 kilowatts (kW) per rack. Only 9% reported an average of 10 kW per rack or higher.
This year, around 50% of respondents reported that their highest rack density was between 10 kW and 29 kW, and 20% reported a highest density of 30 kW or above. This shows that density extremes in datacenters are rising even while averages remain low.
Such high rack densities at enterprise and service provider datacenters reveal that operators face cooling challenges. Uptime Institute said that datacenter cooling systems are often poorly optimized unless the IT environment is specifically built for such high densities.
Most respondents (59%) use precision air cooling for their highest-density racks, followed by basic room-level cooling and liquid cooling techniques.
DCIM has become the norm
Data center infrastructure management (DCIM) software, a powerful and productive technology for modern data centers, provides accurate information about datacenter assets, resource usage, and operational status. DCIM software tracks and reports on redundancy, power utilization, availability, airflow, humidity, and more.
If enterprises integrate DCIM with IT management data, it can also provide insight across the levels of the facility infrastructure and the tiers of the IT stack, supporting end-to-end IT service management that accounts for the availability and resiliency of datacenter resources.
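Commercial DCIM products expose this kind of data through their own dashboards and APIs. Purely as an illustration (the `RackReading` type, rack IDs, and 80% threshold below are hypothetical, not taken from any DCIM vendor), the power-utilization tracking described above amounts to logic like:

```python
from dataclasses import dataclass

@dataclass
class RackReading:
    rack_id: str
    power_kw: float      # measured power draw
    capacity_kw: float   # provisioned power budget for the rack

def utilization_report(readings, threshold=0.8):
    """Return per-rack utilization and flag racks above the threshold."""
    report = {}
    for r in readings:
        util = r.power_kw / r.capacity_kw
        report[r.rack_id] = (round(util, 2), util > threshold)
    return report

racks = [
    RackReading("A01", power_kw=4.5, capacity_kw=6.0),    # typical low-density rack
    RackReading("B07", power_kw=11.2, capacity_kw=12.0),  # high-density rack near budget
]
print(utilization_report(racks))
```

A real DCIM deployment layers live sensor feeds, alerting, and capacity forecasting on top of this kind of aggregation, which is where the capacity-planning and power-monitoring use cases cited by respondents come from.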
Despite the difficulties of deployment, DCIM software has become a mainstream datacenter technology, with 54% of respondents reporting the purchase of some sort of commercial DCIM software. Of these, 75% successfully deployed it, while 47% are supplementing the implementation with additional DCIM tools.
The key reasons to deploy DCIM software were capacity planning (76%), power monitoring (74%), providing executives and customers with visibility or reports (52%), and compliance (35%).
Datacenters unprepared for climate change
Climate change can result in extreme and disastrous weather. For example, a warmer climate can increase the frequency of destructive events such as flooding rains, droughts, and heat waves. This makes datacenters vulnerable in several ways.
Despite these vulnerabilities, 46% of respondents said that they were not preparing for climate-change disruption to datacenters: they have either determined that they won’t be affected, or their managers are ignoring the problem.
Uptime Institute said that even slight changes in temperature for only a few days a year can increase the costs of some technologies or make them unviable. Furthermore, climate change can affect even well-prepared datacenters indirectly, by disrupting the transport and infrastructure that staff and suppliers depend on.
Recruitment of datacenter skills becoming harder
Organizations are finding it harder to recruit and train staff with the skills needed to operate and support hybrid IT environments. Only 35% of respondents reported no hiring or staffing issues, while an equal share had recent or expected staff cuts, or were finding it difficult to find qualified candidates for open jobs.
By industry vertical, the colocation, multi-tenant datacenter, and software sectors had lower levels of staff cuts but still found it difficult to recruit qualified candidates for open jobs.
Half of the respondents said that operations and management is the most critical yet hardest area of expertise to hire for, while around one-third cited security, networking, electrical engineering, cloud provisioning, and mechanical engineering.
Click here to download the full report.
Image source: Uptime Institute