Intel Joins Georgia Tech in DARPA Program to Mitigate ML Deception Attacks


A recent press release states, “Intel and the Georgia Institute of Technology (Georgia Tech) announced today that they have been selected to lead a Guaranteeing Artificial Intelligence (AI) Robustness against Deception (GARD) program team for the Defense Advanced Research Projects Agency (DARPA). Intel is the prime contractor in this four-year, multimillion-dollar joint effort to improve cybersecurity defenses against deception attacks on machine learning (ML) models.”

It adds, “Why It Matters: While rare, adversarial attacks attempt to deceive, alter or corrupt the ML algorithm interpretation of data. As AI and ML models are increasingly incorporated into semi-autonomous and autonomous systems, it is critical to continuously improve the stability, safety and security of unexpected or deceptive interactions. For example, AI misclassifications and misinterpretations at the pixel level could lead to image misinterpretation and mislabeling scenarios, or subtle modifications to real-world objects could confuse AI perception systems. GARD will help AI and ML technologies become better equipped to defend against potential future attacks.”
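The pixel-level deception the release describes can be illustrated with the Fast Gradient Sign Method (FGSM), a classic adversarial-example technique; the sketch below is purely illustrative and is not drawn from GARD itself. The toy logistic-regression "classifier" and four-pixel "image" are made-up examples.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, w, b, y_true, eps=0.25):
    """Nudge each pixel by eps in the direction that increases the loss.

    For a logistic model p = sigmoid(w.x + b) with binary cross-entropy
    loss, the gradient of the loss w.r.t. the input is (p - y) * w.
    """
    p = sigmoid(np.dot(w, x) + b)
    grad_x = (p - y_true) * w
    return x + eps * np.sign(grad_x)

# Toy four-pixel "image" and a toy linear classifier (hypothetical values).
rng = np.random.default_rng(0)
x = rng.random(4)
w = np.array([1.5, -2.0, 0.5, 1.0])
b = 0.0

x_adv = fgsm_perturb(x, w, b, y_true=1.0, eps=0.25)
# No pixel changes by more than eps, yet the model's confidence in the
# true class strictly drops -- the hallmark of an adversarial example.
```

Because each pixel moves by at most `eps`, the perturbed input can look nearly identical to a human while the model's output shifts, which is why such attacks are hard to spot at the data level.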

The release continues, “The Details: Current defense efforts are designed to protect against specific pre-defined adversarial attacks, but remain vulnerable to attacks when tested outside their specified design parameters. GARD intends to approach ML defense differently – by developing broad-based defenses that address the numerous possible attacks in given scenarios that could cause an ML model to misclassify or misinterpret data. Due to its broad architectural footprint and security leadership, Intel is uniquely positioned to help drive innovations in AI and ML technology with a significant stake in the outcome.”
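One well-known broad-based defense of the general kind the release alludes to is adversarial training: augmenting each training step with worst-case perturbed copies of the data so the model learns to classify both clean and attacked inputs. The release does not say GARD uses this technique; the sketch below, with a made-up logistic model and toy dataset, is only a minimal illustration of the idea.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm(X, w, b, y, eps):
    """Batch FGSM attack: shift inputs along the sign of the loss gradient."""
    p = sigmoid(X @ w + b)
    grad_x = (p - y)[:, None] * w  # gradient of BCE loss w.r.t. inputs
    return X + eps * np.sign(grad_x)

def train(X, y, eps=0.1, lr=0.5, steps=200, adversarial=True):
    rng = np.random.default_rng(0)
    w, b = rng.normal(size=X.shape[1]), 0.0
    for _ in range(steps):
        if adversarial:
            # Train on clean data plus FGSM-perturbed copies of it, so the
            # decision boundary keeps a margin against small perturbations.
            X_batch = np.vstack([X, fgsm(X, w, b, y, eps)])
            y_batch = np.concatenate([y, y])
        else:
            X_batch, y_batch = X, y
        p = sigmoid(X_batch @ w + b)
        err = p - y_batch
        w -= lr * X_batch.T @ err / len(y_batch)
        b -= lr * err.mean()
    return w, b

# Toy linearly separable data (label depends only on the first feature).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w, b = train(X, y)
```

Note that this defends only against the single attack it was trained with, which is exactly the limitation GARD aims to move past: defenses built for one pre-defined attack can fail outside their design parameters.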

Read more at Business Wire.

