
How AI Is Driving Data Speeds in Data Centers

By Tony Pialis

Developments in artificial intelligence (AI) technologies have opened up major opportunities and improvements for data processing and analytics. Unfazed by, and even spurred on by, the COVID-19 pandemic, complex AI systems saw explosive demand to enable advances in data management, health care, knowledge graphs, and data science. This has led organizations to increase their AI and machine learning budgets by 83% since 2020.

As demands grow, the adoption of newer and faster AI technologies can create bottlenecks within data centers as digital infrastructure fails to keep up. Ensuring that data bandwidth, transfer speeds, and memory do not limit the potential of AI is leading hyperscalers to pour billions into data centers. Yet AI technology is often seen as a mere enhancement to software and hardware, a separate tool without many requirements of its own. Only by recognizing the importance of digital infrastructure can there be a holistic understanding of AI and its future.


AI Technology Will Face Data Bottlenecks

The faster AI becomes, the more strain it puts on our data centers. With the rapid development and implementation of AI chips, monolithic dies, where everything is packed into one single chip, have grown in complexity and size. Larger dies waste more silicon: an imperfection anywhere during the fabrication process can ruin the entire die, lowering yields and making monolithic AI dies less practical and economical. To reduce wasted silicon, chipmakers have adopted a chiplet-style approach in which multiple smaller chips are connected to deliver equal or greater performance.
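The yield argument above can be made concrete with a simple Poisson defect model, a common first approximation in semiconductor manufacturing (the defect density and die area below are hypothetical illustration values, not figures from this article):

```python
import math

# Sketch of the Poisson yield model: yield = exp(-defect_density * area).
# All numbers here are assumed for illustration only.

defect_density = 0.1   # hypothetical: defects per cm^2
monolithic_area = 8.0  # hypothetical: large AI die area in cm^2

# One large die: a single defect anywhere kills the whole chip.
monolithic_yield = math.exp(-defect_density * monolithic_area)

# Four chiplets covering the same total silicon area: a defect only
# kills the chiplet it lands on, so less good silicon is thrown away.
chiplet_area = monolithic_area / 4
chiplet_yield = math.exp(-defect_density * chiplet_area)

print(f"monolithic die yield: {monolithic_yield:.1%}")  # ~44.9%
print(f"per-chiplet yield:    {chiplet_yield:.1%}")     # ~81.9%
```

Under these assumed numbers, splitting one large die into four chiplets nearly doubles the fraction of usable silicon, which is the economic pressure behind the chiplet approach.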

Hyperscalers have chosen to combine multiple AI chips into clusters that rely on the same resources, such as memory, storage, and interconnects. Because these common resources are shared, interface connectivity risks becoming a key bottleneck that can prevent AI systems from reaching their full performance potential. Software use cases such as machine learning, modeling, and machine vision will also continue to drive the need for greater performance and speed. As these technologies become more powerful, the demand for data will grow even greater, making data speeds, bandwidth, and latency essential to the future of AI technology.
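A back-of-the-envelope sketch shows why a shared interconnect becomes the bottleneck as clusters grow (the link bandwidth, per-step data volume, and compute time below are hypothetical values chosen for illustration, not measurements from any real system):

```python
# Assumed scenario: each accelerator computes for t_compute seconds per
# step, then must move transfer_gb of data over a link whose total
# bandwidth is divided evenly among the chips in the cluster.

link_bw_gbytes_per_s = 100.0  # hypothetical: total shared link bandwidth
transfer_gb = 2.0             # hypothetical: data moved per step, per chip
t_compute = 0.05              # hypothetical: compute time per step (s)

for num_chips in (2, 4, 8, 16):
    per_chip_bw = link_bw_gbytes_per_s / num_chips
    t_transfer = transfer_gb / per_chip_bw
    # Assuming transfer cannot overlap compute, the chip idles while
    # waiting on data, so utilization falls as the cluster grows.
    utilization = t_compute / (t_compute + t_transfer)
    print(f"{num_chips:2d} chips: {utilization:.0%} utilization")
```

Even in this simplified model, doubling the cluster size steadily erodes per-chip utilization; faster interconnects recover that lost compute, which is why interface speed matters so much to clustered AI systems.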

Hardware, the Underappreciated Backdrop

AI chips and software often hold the spotlight for all the potential they promise, but they operate against a backdrop of hardware infrastructure that is often underappreciated. Thousands of components, wires, switches, ports, and more are organized in data centers to connect everything together. Improvements in these technologies are what allow for faster data speeds and greater bandwidth.

For example, data centers are already looking to replace copper connections with optical technology, in which light, rather than electrical pulses, moves data faster, with higher bandwidth and lower latency. Hardware is vital for communication between AI chips as well, with die-to-die interface connectivity supporting the clustered approach. AI does not work in a vacuum: it must be supported by a fast and reliable connectivity infrastructure, and only then can AI reach its full potential.

Opening the Door for Future AI Technologies

Not only will hardware infrastructure enable the maximum performance of AI technology today, but it will also open the path for future developments. Hyperscalers are already designing data centers with the future of AI in mind, offering more room and flexibility for AI systems to expand. In fact, data center operations will continue to evolve and eventually incorporate AI technology to manage facilities.

We could see AI tackle IT issues and data management. For example, hospitals that use electronic health records can use AI to manage data records, yielding cost savings since fewer employees are needed to oversee this automated function. In the near future, data center infrastructure will be foundational in supporting innovations, and AI will have its part to play.

Bridging the Future of Software and Hardware

Most people will never visit a data center and see the infrastructure that powers AI, the cloud, and all the other technologies our world relies on. Understanding that innovative breakthroughs must be supported and powered by hundreds of other vital components brings a greater appreciation for the hardware world behind these technologies. Even a simple high-speed cable and port can determine the performance and future of AI.
