Who co-pilots the co-pilots? Why AI needs cloud support

In the last twelve months, we have seen a vast number of new AI organisations emerge, taking advantage of the latest advances in foundation models, tooling and market demand. Although AI is often seen to act as a ‘co-pilot’ rather than an ‘auto-pilot’, there are still many remarkable feats it can accomplish compared to classical computing. We have recently seen startups offering accurate text-to-sign-language translation, multi-language transcription, and automatic speech-driven video generation with realistic avatars, to name but a few.

However, like all startups and scale-ups, these new organisations face many challenges; some are specific to the AI industry, and others are common to all growth brands. But with the right level of support, founders can flourish, helping to drive the industry – and humanity – forwards.


High computational power for training AI models

One of the main challenges that AI organisations face is that of training. Training AI models requires a significant amount of computational power, which can be difficult for deep tech companies that tend to operate on an opex rather than a capex basis. Deep learning algorithms, such as neural networks, require a large number of iterations and adjustments to achieve optimal results, and this can be time-consuming and costly without access to high-performance computing resources. Furthermore, the training data itself needs to be stored somewhere, and dedicated storage hardware can be cost-prohibitive to purchase outright and expensive to maintain.
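To make those “iterations and adjustments” concrete, here is a minimal, hypothetical PyTorch training loop. The model and the random data are placeholders, not any particular startup’s workload; the point is that the same forward/backward/update cycle repeats over the full dataset, epoch after epoch, which is why sustained GPU compute matters so much.

```python
# Minimal sketch: an iterative training loop in PyTorch (placeholder model, random data).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

device = "cuda" if torch.cuda.is_available() else "cpu"

# Placeholder model and data; real workloads use far larger models and datasets.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

loader = DataLoader(
    TensorDataset(torch.randn(10_000, 512), torch.randint(0, 10, (10_000,))),
    batch_size=64,
    shuffle=True,
)

# Many passes over the data, each adjusting every weight in the model: this
# repetition is what makes high-performance (GPU) compute so valuable.
for epoch in range(10):
    for inputs, labels in loader:
        inputs, labels = inputs.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```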

Flexibility in resource allocation and cost management

The resource requirements for training and deploying AI models can vary greatly depending on the complexity of the model and the size of the data set. As with most startups, the direction of the company can change almost overnight, which is challenging for both people and technology infrastructure. Consequently, most AI startups are cloud-native by default, so they can pivot to new hardware when things start to move in a different direction.

Backwards compatibility issues

AI frameworks such as TensorFlow and PyTorch are continually being updated and improved, but a number of these framework iterations have not been backwards compatible with previous versions. This puts significant pressure on organisations to keep up to date with the latest framework, or they risk functionality issues or even downtime. Although users do often expect startups to have teething problems, large amounts of downtime can dramatically erode trust.
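One common way teams guard against this is to pin the framework versions they have tested against and fail fast if anything drifts. The sketch below is a generic illustration, not a prescription: the version numbers are placeholders, and the check simply compares the installed package against the pin before the service starts.

```python
# Minimal sketch: pinning framework versions and verifying them at startup.
# Example pins in requirements.txt (placeholder versions, not recommendations):
#   torch==2.2.0
#   tensorflow==2.15.0
from importlib.metadata import version
from packaging.version import Version

EXPECTED = {"torch": "2.2.0"}  # hypothetical known-good pins


def check_pins(expected: dict[str, str]) -> None:
    """Fail fast if an installed framework differs from the version it was tested against."""
    for package, pinned in expected.items():
        installed = Version(version(package))
        if installed != Version(pinned):
            raise RuntimeError(
                f"{package} {installed} is installed, but this service was tested against "
                f"{pinned}; framework upgrades may introduce backwards-incompatible changes."
            )


if __name__ == "__main__":
    check_pins(EXPECTED)
```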

With these issues in mind, how have existing, successful AI startups overcome their challenges?


AI in Practice: OVHcloud empowers Customs Bridge’s essentials

Customs Bridge is a “deep tech” startup that uses artificial intelligence algorithms to create an automatic product classification engine, aimed at European importers. The company’s mission is to create the most reliable product classification engine possible to assign the correct customs code to a product whose description is not fully formalized.

However, Customs Bridge faced significant challenges in training their AI models. They had limited on-premise infrastructure, large-scale data processing requirements, and the need for state-of-the-art AI frameworks. Their existing infrastructure was not sufficient to train and deploy their AI models effectively, and they faced difficulties in accessing and processing large volumes of data required to train their models.

To overcome these challenges, Customs Bridge turned to OVHcloud’s AI & Machine Learning solutions. The team implemented OVHcloud’s model training solution, AI Training, and utilized OVHcloud instances to deploy models into production and power the data pipeline. This allowed Customs Bridge to process large amounts of data, enhance its AI models, and improve its overall productivity and efficiency.
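As an illustration of what “deploying a model into production on a cloud instance” can look like, here is a hypothetical serving sketch. The framework choice (FastAPI), the model path and the endpoint name are all assumptions for the sake of the example; the article does not describe Customs Bridge’s actual service.

```python
# Hypothetical sketch: exposing a trained text-classification model over HTTP.
# Model path and route are placeholders, not Customs Bridge's real setup.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
# Placeholder path to a locally saved, fine-tuned classification model.
classifier = pipeline("text-classification", model="./model")


class Item(BaseModel):
    description: str


@app.post("/classify")
def classify(item: Item):
    # Return the predicted label (e.g. a customs code) and its confidence score.
    result = classifier(item.description)[0]
    return {"label": result["label"], "score": result["score"]}

# Run with: uvicorn app:app --host 0.0.0.0 --port 8000
```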

Customs Bridge was able to leverage OVHcloud’s resources for data enhancement and advanced AI model training. They relied on around 2.5TB of data to train their first Transformers models, and training Transformers on 250,000 lines only took around 30 minutes of computing time, thanks to the NVIDIA V100 GPUs provided by OVHcloud. This was both fast and low-cost, and it allowed Customs Bridge to scale its data volumes without limiting its infrastructure. The cloud-based approach gave the company a lot of freedom to experiment until they found the volume needed to achieve the precision they wanted.
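For readers unfamiliar with this kind of workload, fine-tuning a Transformer for text classification typically follows the pattern below. This is a generic Hugging Face sketch, not Customs Bridge’s code: the base model name, the number of labels and the toy dataset are placeholders.

```python
# Hypothetical sketch: fine-tuning a Transformer to classify product descriptions.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "bert-base-multilingual-cased"  # placeholder pretrained encoder
NUM_LABELS = 100                             # placeholder number of customs codes

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=NUM_LABELS)

# Placeholder data: product descriptions paired with customs-code labels.
data = Dataset.from_dict({
    "text": ["stainless steel kitchen knife", "cotton t-shirt, printed"],
    "label": [3, 61],
})


def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)


data = data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=32, num_train_epochs=3),
    train_dataset=data,
)
trainer.train()  # on GPU hardware this same loop scales to hundreds of thousands of rows
```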

In addition to improved flexibility and scalability for AI model training, Customs Bridge also benefited from cost-effective and efficient resource allocation, simplified implementation and deployment of AI frameworks, and the ability to enable innovation and experimentation for optimal results. By leveraging OVHcloud’s AI & Machine Learning solutions, Customs Bridge was able to overcome its challenges and build an innovative and effective product classification engine.

Elevating deep tech with specialized cloud services

One of the first steps for a growing AI startup is to understand its ecosystem – and not just in terms of understanding the competition. There are many organisations that offer incubators, accelerators and support schemes, which can either help directly with mentoring and management assistance or, as in the example above, with technology infrastructure support.

Cloud services offer flexible resource allocation and cost management, allowing deep tech firms to scale their resources up or down as needs change. This adaptability ensures that companies pay only for the resources they require, which allows them to allocate budgets more efficiently and to operate on an opex, rather than capex, basis.

Expandable storage solutions are also an important part of the cloud services model. With these solutions, deep tech companies can store and process the large quantities of data needed to train their AI models. These solutions are designed to scale easily, ensuring that AI firms can increase their data volumes without any interruption to service – unlike physical storage, where the installation and management of new drives can cause a number of headaches.
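In practice, this often means keeping training data in object storage behind an S3-compatible API, which many providers (including OVHcloud) expose. The sketch below uses boto3 with placeholder endpoint, credentials and bucket names, purely to illustrate how data can keep growing without anyone installing new drives.

```python
# Minimal sketch: pushing training data to S3-compatible object storage.
# Endpoint, credentials and bucket names are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-region.cloud.example.com",  # placeholder endpoint
    aws_access_key_id="ACCESS_KEY",                              # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)

BUCKET = "training-data"  # placeholder bucket name

# Upload a local shard of the training set; the bucket grows transparently
# as more shards are added over time.
s3.upload_file("data/shard-0001.parquet", BUCKET, "datasets/v1/shard-0001.parquet")

# List what is stored so far.
for obj in s3.list_objects_v2(Bucket=BUCKET).get("Contents", []):
    print(obj["Key"], obj["Size"])
```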

Driving the industry forwards

Deep tech AI firms experience many of the same issues as startups across other industries, but also some unique challenges. The vast datasets required to train AI models, for example, come with a corresponding need for high-power compute and storage capabilities, which are often out of reach for young organisations running on seed funding.

This is why many AI companies are cloud-native by default. The cloud allows organisations like these to scale more easily without paying up-front for infrastructure, not to mention benefitting from managed solutions that remove the need for day-to-day management from founders and their teams. However, startups must pay attention when setting up their cloud services agreement and take care to avoid both spiralling and hidden costs; the wrong set-up or the wrong provider – overcharging for ingress/egress traffic, for example – can result in a technology burden. But with the right partner, the right solution and a truly collaborative approach, startups can forget about the administrative details and instead focus on their main mission: creating a new world of AI.
