Demand for data centers during coronavirus – Kingston Technology


What is being demanded of data centers in these unprecedented times? Read this article by industry expert Professor Sally Eaves for an understanding of these demands.

Demand for data centers during coronavirus

Photo of Professor Sally Eaves

Professor Sally Eaves is Chair of Cyber Trust and Senior Policy Advisor to the Global Foundation for Cyber Studies and Research. Described as the "torchbearer for ethical tech", she was the first recipient of the Frontier Technology and Social Impact award, presented at the United Nations. A chief technology officer by background, she is now a professor in advanced technologies and a global strategic advisor on emergent technologies. Sally is an award-winning international author, host and keynote speaker, and an acknowledged expert across artificial intelligence, 5G, cloud, blockchain, cybersecurity, governance, the Internet of Things, and data science and analytics, alongside culture, skills, DEI, sustainability and social impact.

Sally is actively engaged in education and mentoring, supporting the next generation of technology talent, and founded Aspirational Futures to enhance inclusion, diversity and equality in education and technology. Her forthcoming book, Tech for Good, is due for release in 2022. Sally is consistently recognized for her global influence in the technology space by leading organizations such as Onalytica, and ranks among the top ten worldwide across multiple disciplines, from artificial intelligence to 5G, sustainability and beyond.

Data is paramount now that we have to stay at home

An SSD installed in a server

In this unprecedented period of universal uncertainty, both the speed and the scope of how we work, learn and live have changed. Being prepared for the unexpected lies at the heart of the data center industry, whose "invisible" operations have been classified in most countries as "essential services". Affecting organizations of all sizes, in every industry, coronavirus has impacted business continuity across the entire economy. Scaling up and optimizing existing services to meet new requirements has therefore become a strategic and operational necessity for data centers. Two key factors are catalyzing the massive demand. The first is the need for computing capacity, driven by the large-scale shift of many enterprises and institutions to working from home. Digital infrastructure has never been so important to the global economy. The accompanying use of digital applications for video conferencing, telemedicine, e-commerce and e-learning, along with entertainment as we all spend more time indoors, is driving up the need for data capacity.

A man standing beside a server rack with a laptop

Demand for data center bandwidth is breaking records

A recent Kingston publication 1 on data center growth highlighted our "insatiable need" for data, further fuelled by the arrival of 5G, the Internet of Things and edge computing. Putting this in the context of coronavirus, on March 11, 2020, data center services provider Deutsche Commercial Internet Exchange set a new world record 2 in Frankfurt: data throughput of more than 9.1 terabits per second. Vodafone reported a 50% increase in data traffic 3 in some markets, with a significant rise in internet service usage. Moreover, BT removed caps on home broadband data 4 to support working, learning and everyday life. While monitoring of our internet infrastructure does reflect an upward trend in outages 5, their number has not reached a level that would match such an unprecedented increase in traffic. In addition, despite current growth and the forecast annual increase in data center traffic and workloads, it is important to note that global energy demand in data centers is projected by the International Energy Agency (IEA) to decrease 6. This is due both to consolidation and to efficiency practices, especially in cooling systems, as well as to the arrival of some new hyperscale data centers that will run on 100% renewable energy.
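To put the Frankfurt record into perspective, a rough back-of-the-envelope conversion shows how much data 9.1 terabits per second represents (this assumes perfectly sustained throughput, which real traffic never is; the figure was a peak, not an average):

```python
# Back-of-the-envelope: how much data does 9.1 Tbps move?
# Assumes perfectly sustained throughput, which real traffic never is.
TBPS = 9.1                                      # DE-CIX Frankfurt peak, terabits/s
bits_per_second = TBPS * 1e12
bytes_per_second = bits_per_second / 8          # 8 bits per byte
terabytes_per_hour = bytes_per_second * 3600 / 1e12

print(f"{bytes_per_second / 1e9:.1f} GB/s")     # 1137.5 GB/s
print(f"{terabytes_per_hour:.0f} TB/hour")      # 4095 TB/hour
```

In other words, at that peak rate roughly four petabytes of data would cross the exchange every hour.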

High-performance computing enables research to combat the virus

Drives in a server rack with green LEDs

The second determining factor is the need for speed and computing power, and this is where HPC (High Performance Computing) comes to the fore. Research into vaccines and treatments generates large data sets. Research institutions and pharmaceutical companies are using HPC systems at scale for the first time for data modeling and computation in areas such as epidemiology and bioinformatics. This can help significantly reduce the time needed to develop new drugs.

The COVID-19 High Performance Computing Consortium 7 brings together leaders from technology, government and academia, and provides more than 330 petaflops of computing speed, 775,000 CPU cores and 34,000 GPUs. In the effort to better understand the virus and develop treatments that could feed into potential vaccines, an incredible 330 quadrillion floating-point operations per second can now be performed. This is also a great example of participants such as IBM, Amazon, Microsoft, HPE and Google prioritizing collaboration over competition.
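The consortium's headline figures can be sanity-checked with simple unit arithmetic (a sketch only; the published numbers are aggregates across many heterogeneous systems, so a per-device average would be meaningless):

```python
# One petaflop = 10**15 floating-point operations per second, so the
# consortium's 330 petaflops equal 330 quadrillion operations per second.
PETA = 10**15
petaflops = 330
total_flops = petaflops * PETA
print(f"{total_flops:.2e} FLOPS")                    # 3.30e+17

# Aggregate compute resources cited for the consortium
cpu_cores = 775_000
gpus = 34_000
print(f"{cpu_cores + gpus:,} CPU cores and GPUs combined")
```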

Although often hidden in plain sight, data centers are both the powerhouses of cloud computing and the connectivity elements of the Internet, fulfilling the triple need for more speed, more flexibility and optimal availability. Operators are working hard to build resiliency to minimize the risk of downtime and service interruption for their various user groups. With Gartner estimating the average cost of IT downtime at around $5,600 per minute, it is no surprise that this is a leading concern for IT executives. Just as retail consumers are concerned about stocking up on food and household supplies, symbolized by the worldwide stockpiling of toilet paper 8, data center customers are seeking additional capacity and throughput to meet both rapidly growing demand and business continuity protection.
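Gartner's per-minute figure scales up alarmingly over the length of a typical outage. A quick calculation using the cited average (actual costs vary widely by industry and company size) shows why downtime tops executives' worry lists:

```python
# Average cost of IT downtime, per the Gartner estimate cited above.
# Real costs vary widely by industry; this is the cited average only.
COST_PER_MINUTE = 5_600  # USD

def downtime_cost(minutes: float) -> float:
    """Estimated cost of an outage lasting `minutes` minutes."""
    return minutes * COST_PER_MINUTE

print(f"1 hour:  ${downtime_cost(60):,.0f}")         # $336,000
print(f"8 hours: ${downtime_cost(8 * 60):,.0f}")     # $2,688,000
```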

Managing the virtual virtually

A cityscape overlaid with a glowing microchip circuit pattern

Transparency is critical both to reassure and to build trust, which is why data center operators are publicly sharing the criteria they have established to manage and, where appropriate, prioritize new cloud service capacity to protect mission-critical operations 9. Substantial support is also being provided to existing clients. To this end, most ISPs are introducing short-term measures such as suspending bandwidth overage charges. For many SMB customers facing network traffic and bandwidth demands that no one expected or planned for, the ability to increase system and network loads beyond any existing CDR, up to their maximum port bandwidth without penalty, can be invaluable in keeping their business viable during this turbulent time.
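The arrangement described above can be sketched as simple billing logic: usage above the committed data rate (CDR) would normally incur overage charges, but a temporary waiver lets customers burst up to their physical port speed at no cost. This is an illustrative sketch only, with assumed values and function names, not any provider's actual billing model:

```python
# Simplified burstable-billing sketch: traffic above the committed data
# rate (CDR) normally incurs overage, but a temporary waiver lets
# customers burst up to the physical port speed at no extra charge.
# All figures and names below are illustrative assumptions.
CDR_MBPS = 1_000    # committed data rate
PORT_MBPS = 10_000  # physical port limit (hard cap)

def billable_overage(samples_mbps, waiver_active):
    """Total Mbps of usage above CDR that would be billed.

    `samples_mbps` are periodic utilization samples; with the waiver
    active, bursts up to the port speed are forgiven entirely.
    """
    overage = 0
    for mbps in samples_mbps:
        usage = min(mbps, PORT_MBPS)  # can never exceed the port speed
        if usage > CDR_MBPS and not waiver_active:
            overage += usage - CDR_MBPS
    return overage

samples = [800, 2_500, 9_000, 12_000]  # last sample hits the port cap
print(billable_overage(samples, waiver_active=False))  # 18500
print(billable_overage(samples, waiver_active=True))   # 0
```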

But ensuring business continuity for customers and partners also requires business continuity for the data centers themselves, especially when it comes to protecting staff, partners and suppliers. One of the key areas supporting this has been the level of knowledge sharing within the sector, especially on health, safety and wellbeing, as well as on HR and supply chain issues. It has also involved a two-way channel with government. In the UK, for example, the DCMS (Department for Digital, Culture, Media and Sport) Data Infrastructure Resilience Team was set up to ensure that data centers are taken into account when key policy decisions are made.

Most importantly, this means limiting pathways of infection and confronting the reality of facilities being closed for quarantine. This has required restricting access to sites and introducing staggered staffing arrangements in line with social distancing practices. At the same time, related availability problems have arisen: above all, reduced on-site staffing and the absence of key personnel due to illness or self-isolation.

End-to-end control of critical facilities is especially important for reducing risk, so the ability to manage them remotely is essential. The combination of sensor technology, troubleshooting support services such as "smart hands" 10, and systematic monitoring of key activities, especially power distribution and temperature, provides an increased level of remote visibility in real time. This in turn opens the door to advanced levels of data analysis and the prediction of potential incidents. Indeed, alongside meeting the needs for computing, speed, reliability and power, coronavirus has also brought to the fore just how much of a data center can be managed remotely. Moreover, the role of data centers has now moved firmly into the spotlight, in areas such as work, study, communication and entertainment, as well as in better understanding the virus and supporting treatment interventions.
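The systematic monitoring described above, tracking power and temperature readings and flagging anomalies before they become incidents, can be sketched as a simple threshold check. The sensor names and limits here are illustrative assumptions (the temperature ceiling loosely echoes ASHRAE-style recommended inlet ranges), not any operator's actual configuration:

```python
# Minimal remote-monitoring sketch: flag sensor readings that breach
# illustrative thresholds so operators can intervene before an incident.
# Sensor names and limits are assumed for illustration.
THRESHOLDS = {
    "inlet_temp_c": 27.0,   # loosely based on ASHRAE recommended inlet ceiling
    "power_draw_kw": 8.0,   # illustrative per-rack power budget
}

def check_readings(readings):
    """Return a list of (sensor, value, limit) alerts for breached limits."""
    alerts = []
    for sensor, value in readings.items():
        limit = THRESHOLDS.get(sensor)
        if limit is not None and value > limit:
            alerts.append((sensor, value, limit))
    return alerts

rack = {"inlet_temp_c": 29.5, "power_draw_kw": 6.2}
for sensor, value, limit in check_readings(rack):
    print(f"ALERT: {sensor} = {value} exceeds {limit}")
```

In practice such checks would feed a DCIM or telemetry pipeline rather than a print statement, but the principle, continuous readings compared against known-safe envelopes, is the same.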