5 tips for building foundation models for AI

imaginima/Getty

Many businesses are just beginning to grapple with the impact of artificial intelligence, but some have been using machine learning (ML) and other emerging technologies for over a decade.

Also: Most AI projects are abandoned – 5 ways to ensure your data efforts succeed

For Manish Jethwa, CTO at Ordnance Survey (OS), the UK’s national mapping service, the priority is to combine his organization’s AI and ML experiences with recent advances in generative AI to refine, distribute, and apply its treasure troves of data.

Jethwa explained to ZDNET how large language models (LLMs) are helping OS users find and query geospatial data. One of the key elements here is the organization’s foundation models for AI, which serve as a base for building more specialized applications.

Also: 4 questions to ask yourself before betting on AI in your business – and why

While tech analysts like Gartner suggest there is much hype around generative AI, Jethwa’s team at OS combines foundation models with commercially available tools to exploit and distribute geospatial information.

Here are five key lessons business leaders can take away from Jethwa’s deployment of AI foundation models.

1. Develop a strong case for use

Jethwa said OS is developing foundation models to extract environmental characteristics for analysis in a way that is sensitive to copyright. The organization has a long-standing history of high-precision data collection, which feeds its AI developments.

“Where we’re trying to extract features, we build foundation models from the ground up,” he said. “That will be a model where we’re defining the full training set with the labelled data that we’ve got internally.”

See also: 4 ways to use AI as a competitive advantage for your business

Foundation models are also used in other areas of data analysis, and Jethwa says the message is simple: you can reuse what you have already built, over and over again.

“The foundation models are there to help us build subsequent output. So, if we then wanted to learn about roof materials or green spaces or biodiversity, we could do all of that from the same foundation model,” he said. “Rather than having to train multiple foundation models, you just do the fine-tuning at the end. This process allows us to connect the problem we’re trying to solve with source data.”
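The reuse Jethwa describes, one shared foundation model feeding several fine-tuned tasks, can be sketched in a few lines of Python. Everything here (the feature extractor, the task heads, the thresholds) is an illustrative stand-in, not OS code:

```python
# Illustrative sketch: one shared "foundation" feature extractor feeds
# several lightweight task heads, so each new task (roof materials,
# green spaces, and so on) only needs its own small fine-tuned head.

def foundation_features(image_patch):
    """Stand-in for a pretrained backbone: maps raw input to features."""
    # A real model would return a learned embedding; here we fake one.
    return [sum(image_patch) / len(image_patch), max(image_patch), min(image_patch)]

class TaskHead:
    """A small per-task classifier sitting on top of the frozen backbone."""
    def __init__(self, task_name, threshold):
        self.task_name = task_name
        self.threshold = threshold  # in reality, learned during fine-tuning

    def predict(self, image_patch):
        features = foundation_features(image_patch)  # backbone is reused, not retrained
        return features[0] > self.threshold  # toy decision rule

# The same backbone serves every head; only the heads differ per task.
roof_head = TaskHead("roof_materials", threshold=0.5)
green_head = TaskHead("green_spaces", threshold=0.2)

patch = [0.1, 0.4, 0.9]
print(roof_head.predict(patch), green_head.predict(patch))
```

The design point is the one in the quote: training the expensive backbone happens once, and each new question about the landscape only adds a cheap head.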

2. Establish methods with a purpose

Jethwa says focused training can help to reduce costs when building foundation models. “The execution of these models takes far less energy and resources than the actual training,” he said.

OS feeds its models training data in chunks. “You have to curate data across the country with a wide variety of classes that you’re trying to learn from, so a different mix between urban and rural, and more,” he said.
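That curation step, making sure every chunk of training data mixes classes such as urban and rural tiles rather than one homogeneous region, can be sketched with a simple round-robin interleave. This is a hypothetical illustration, not OS's actual pipeline:

```python
# Illustrative sketch: interleave tiles from every class before slicing
# into chunks, so each training chunk sees a mix of urban and rural data.

def stratified_chunks(tiles_by_class, chunk_size):
    """Round-robin across classes, then cut the result into chunks."""
    pools = [list(tiles) for tiles in tiles_by_class.values()]
    interleaved = []
    while any(pools):
        for pool in pools:
            if pool:
                interleaved.append(pool.pop(0))
    return [interleaved[i:i + chunk_size] for i in range(0, len(interleaved), chunk_size)]

chunks = stratified_chunks(
    {"urban": ["u1", "u2", "u3"], "rural": ["r1", "r2", "r3"]},
    chunk_size=2,
)
print(chunks)  # each chunk mixes urban and rural tiles
```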

See also: 5 ways to become a great AI manager, according to business leaders

First, the organization builds a small AI model using several hundred examples. This approach helps to limit costs and ensure OS is heading in the right direction.

“Then we slowly build up that labelled set,” Jethwa said. Although the organization’s models are smaller than well-known alternatives, the results are still impressive. “The models might solve a wider variety of problems, but, for our specific domain, we outperform those models, even at a smaller scale,” he said.
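The "start small, then slowly build up the labelled set" loop described above can be sketched as follows. The `train` and `accuracy` functions are toy stand-ins (real training and evaluation are far more involved), but the control flow captures the cost-limiting idea:

```python
# Hedged sketch of progressive labelling: train on a few hundred
# examples, check the direction is right, then label more data.

def train(labelled_examples):
    """Stand-in for model training; the 'model' just records set size."""
    return {"n_seen": len(labelled_examples)}

def accuracy(model):
    """Stand-in evaluation: pretend accuracy improves with more labels."""
    return min(0.99, 0.5 + model["n_seen"] / 2000)

labelled = list(range(300))      # begin with a few hundred examples
model = train(labelled)
while accuracy(model) < 0.9:     # grow the set only while needed
    labelled.extend(range(len(labelled), len(labelled) + 300))  # label another batch
    model = train(labelled)

print(len(labelled), round(accuracy(model), 2))
```

The point of the pattern is that labelling effort stops as soon as the model is good enough for the target domain, rather than being committed up front.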

3. Use other LLMs to fine-tune

Just because OS builds its own foundation models does not mean the organization ignores well-known large language models, said Jethwa. “We’re building off the existing models and doing the fine-tuning based on our documentation.”

OS utilizes the full breadth of commercially available LLMs. As a Microsoft shop, the organization uses Azure machine learning models and Python-based tools.

Also: ChatGPT will now do your work

Jethwa said OS is also exploring partnerships with external organizations, such as IBM and other technology suppliers, to create collaborative solutions to data-driven challenges.

As with foundation models, it is important to keep costs down.

“It’s an effort to rationalize,” Jethwa said. “Internally, the main way of taking that approach is by building up slowly and ensuring the destination you’re trying to head towards is achievable, and you’re not wasting resources with fruitless activity.”

4. Consider commercialization

Now that OS has begun to build and refine its foundation models, could these technologies be used or sold by other organizations? Jethwa said the answer is yes.

Crown copyright is a form of copyright that covers assets created by employees of the UK public sector, he said.

Also: Microsoft is saving billions with AI and laying thousands off – where are we going from here?

Jethwa said that when OS provides open access, the organization’s assets must not be collected and monetized unless they produce benefits for UK taxpayers.

“We’re trying to protect our data as much as possible, but, at the same time, deliver as much value for the UK,” he said. “So, it’s trying to get that balance right, which is a challenge.”

5. Keep an eye on the future

Jethwa said his organization’s foundation models have proven the benefits generative AI offers for providing access to in-depth insights. He painted a picture of how the OS approach to AI could develop over the next decade.

Also: According to Amazon, AI is already enhancing 5 entry-level tech positions

Jethwa said the key to long-term success is to use APIs and data to create definitive answers to prompts, drawing on trusted sources, including OS information, along with external sources. “You want to know where the actual schools are,” he said. “AI has to translate a true request, going back to an authoritative source, which OS is, and we can pull the data and deliver the output.”
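The pattern Jethwa describes, translating a question into a lookup against an authoritative source rather than letting a model guess, can be sketched as below. The intent parser and the dataset are hypothetical stand-ins, not OS's actual API or data:

```python
# Illustrative sketch: answer a prompt from authoritative records,
# not from generated text. Place and school names are toy examples.

AUTHORITATIVE_SCHOOLS = {   # stand-in for a trusted geospatial dataset
    "Southampton": ["Bitterne Park School", "St Anne's School"],
    "Exeter": ["Exeter School"],
}

def parse_intent(question):
    """Toy 'LLM' step: pull the place name out of the question."""
    for place in AUTHORITATIVE_SCHOOLS:
        if place.lower() in question.lower():
            return {"feature": "schools", "place": place}
    return None

def answer(question):
    intent = parse_intent(question)
    if intent is None:
        return "No authoritative data found for that question."
    # The answer is assembled from source records, so it can be wrong
    # only if the authoritative data is wrong, never hallucinated.
    return AUTHORITATIVE_SCHOOLS[intent["place"]]

print(answer("Where are the schools in Exeter?"))
```

In a production system the dictionary would be replaced by an API call to the trusted source, and the parsing step by a real language model; the division of labor stays the same.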

Receive the top stories of the day in your inbox every morning with our Tech Today newsletter.
