Appen Training Data Solution Unveils Enhancements to Accelerate Customers’ AI Initiatives

A new press release reports, “Appen Limited, the leading provider of datasets used by companies and governments to train AI systems quickly and at scale, today introduced feature updates for its AI training data solution designed to accelerate customers’ artificial intelligence initiatives. The Appen platform – already the most comprehensive solution for collecting and labeling images, text, speech, audio, and video – combines Figure Eight’s machine learning (ML)-enabled annotation tools and self-serve client workspaces with Appen Connect to oversee Appen’s global multilingual crowd of more than 1 million skilled contractors, and a wide range of managed services to ensure delivery of high-quality training data at scale, with the speed and security required by customers. The feature enhancements today announced include: ML-Assisted Text Annotation, ML-Assisted Text Utterance Collection, Enterprise Analytics.”

The release goes on, “Available for multiple use cases, these feature updates—together with the platform’s ML-Assisted Video Object Tracking using Dots, Lines, and Polygons capability—further cement Appen’s unique ability to deliver on the increasing volume, quality, and speed requirements for training data to support the world’s most innovative AI systems. ‘AI can’t function and improve without a constant stream of large volumes of high-quality training data, a market that will be worth up to $19 billion – 10% of the overall AI market – by 2025,’ said Appen CTO Wilson Pang. ‘To ensure our customers can continue to develop accurate, powerful AI products and services, we are constantly enhancing our solution with new features to help them meet their data needs today and into the future’.”

Read more at PR Newswire.

Image used under license from Shutterstock.com