
How AI is Transforming Patent Intelligence in 2024

Sakari Arvela, CEO and co-founder of IPRally, sets out his forecast for 2024. He predicts that AI will continue to shape the future of IP search, with most progress made where generalist LLMs for text analysis and generation meet specialist search models like Graph AI.

2023 was a breakthrough year for AI technology, marked by the rise of generative AI with a multitude of new GPT models and advanced diffusion models.

These achievements, years in the making, were made possible by ongoing research and development far from the public eye. 

Yet, generative AI is only a part of the vast landscape of AI innovation. Specialized models are tackling unique challenges across various domains, including patent intelligence.

As we look ahead to 2024, the big questions are – how will 2024 and beyond shape up for the patent industry with respect to technology? How are the AI models that work in the background of patent search evolving? Where do the Large Language Models (LLMs) fit in? What big changes are heading our way? 

Graph AI – The Patent Specialist AI

Although the media buzz is mostly around generative AI, real strides have been made in specialist AI models that solve specific problems.

The core technology behind the IPRally patent search tool is knowledge graphs: modeling the technical information contained within patents in a structured and visual way.

Why did we choose that? Because it:

  • resonates well with the mindset of a patent professional, 
  • reflects neatly the characteristics of technical information (structures, properties, functions, relations), 
  • and, most importantly, allows us to use efficient machine learning techniques, such as Graph Neural Networks (GNNs), to build a powerful patent search (a minimal sketch of this idea follows below).
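
To make the graph-and-GNN idea concrete, here is a minimal sketch in Python using PyTorch Geometric. It is not IPRally's actual model: the node features, the toy graph structure, and the untrained encoder are all illustrative assumptions. The point is only the general pattern of turning each patent's knowledge graph into a vector and comparing patents by vector similarity.

```python
# Minimal sketch (not IPRally's model): embed a patent "knowledge graph"
# with a Graph Neural Network and compare patents by embedding similarity.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool


class PatentGraphEncoder(torch.nn.Module):
    """Two GCN layers followed by mean pooling -> one vector per patent graph."""

    def __init__(self, in_dim: int = 32, hidden_dim: int = 64, out_dim: int = 128):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, out_dim)

    def forward(self, data: Data) -> torch.Tensor:
        x = F.relu(self.conv1(data.x, data.edge_index))
        x = self.conv2(x, data.edge_index)
        # Pool node embeddings into a single graph-level embedding.
        batch = torch.zeros(x.size(0), dtype=torch.long)
        return global_mean_pool(x, batch)


def toy_patent_graph(num_nodes: int) -> Data:
    """Stand-in for a real patent graph: nodes = technical features
    (e.g. 'battery', 'anode', 'coating'), edges = relations between them."""
    x = torch.randn(num_nodes, 32)            # random node feature vectors
    src = torch.arange(num_nodes - 1)
    dst = torch.arange(1, num_nodes)
    edge_index = torch.stack([src, dst])      # a simple chain of relations
    return Data(x=x, edge_index=edge_index)


encoder = PatentGraphEncoder()
query = encoder(toy_patent_graph(5))
candidate = encoder(toy_patent_graph(7))
# Higher cosine similarity = more closely related technical content.
print(f"similarity: {F.cosine_similarity(query, candidate).item():.3f}")
```

In practice such an encoder would be trained so that graphs of technically related patents end up close together in the embedding space, which is what turns "compare vectors" into "find the most relevant prior art".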

We introduced the graphs way back in 2018 to move away from traditional Boolean queries and black-box semantic searches. The Graph AI model became the heart of IPRally’s patent search and, later, patent classification engine. 

After expanding our AI research team and efforts over the years, we launched a new Graph Transformer in 2023, followed soon after by a more advanced version. The latest models use transformer technology similar to that of LLMs, but built specifically for “the patent problem”: finding the most relevant patents in the global patent data as quickly as possible.

The approach was quickly adopted by multiple enterprise IP departments and service providers thanks to its time, quality, and cost benefits. Soon after, public intellectual property offices began to follow.

The reason for its success is simple: it makes the IP search journey much easier. You can now find relevant patents much more quickly and accurately.

In 2024, we will continue on this track to take the technology to the next level and make each patent search project a more enjoyable task. We just can’t get enough of the “wow”s from our users!   

How Generative AI is Shaping the Future

2023 was the year that LLMs became more than just parlor tricks. 

As far back as 2019, I was using GPT-generated texts in my presentations. It was fun to see the reactions when I showed the audience a patent-related quote that I later revealed had been written by AI. Now I need to think of a new way to impress audiences, as ChatGPT has killed the effect!

That has been the biggest change – AI has now entered people’s everyday lives.

In addition to growing awareness, LLMs have also expanded the utility of AI in the patent space. 

The leap was so big that building tools for things like innovation facilitation, patent drafting, and deep content analysis (as well as new conversational interfaces for these) is now feasible. 

There is sure to be a lot of innovation and progress in these areas over the coming months and years. In the short term, a big everyday impact is likely to take place in the “triangle” between specialist search models like Graph AI, generalist LLMs for text analysis and generation, and professional users. 

Good control over their work is important for patent professionals, and combining these technologies smartly keeps the user in the driver’s seat – or, more accurately, moves the user from the engine room to the driver’s seat.

Pitfalls and Benefits of Generative AI

A common misconception about LLMs like ChatGPT is that they are good at searching through information and providing reliable references to sources.

This simply isn’t true.

By now, most people have heard of “AI hallucinations”. This refers to tools like ChatGPT reeling off “facts” that look credible, but are completely made up.

The reason is that LLMs are built to remember their vast, but limited, training data and apply it creatively in a given context, even when the output bears no relation to the real world.

That’s not to say that LLMs don’t have some useful applications in search systems. Far from it.

One powerful feature of LLMs is their rapidly growing “context window”. Instead of training a new model, which is expensive, you can achieve a lot by feeding carefully selected data into a pre-trained model and processing it on the fly.

Here are a few concrete examples: 

  • LLMs can analyze search results with precision, extracting details, analyzing claims and embodiments, and summarizing context. IPRally’s Ask AI already offers these features through a chat interface, while keeping the user in full control and mitigating the hallucination and missing-reference problems.
  • In a Retrieval Augmented Generation (RAG) setup, LLMs can act as the “command center” for the user, providing a fully conversational experience while still relying on dedicated search models to fetch factual data and sources (a minimal sketch of this setup follows the list).
  • A generative patent drafting co-pilot can use a search model optimized for claims (like IPRally) in the background to retrieve prior art for patent text generation and claim scope optimization.
  • LLMs can automatically label data, streamlining the training of efficient search models.
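
As a concrete illustration of the RAG pattern in the second bullet above, here is a minimal Python sketch. The `search_patents` function is a hypothetical placeholder standing in for any dedicated search model (it is not IPRally's API), and the OpenAI model name is only an example.

```python
# Minimal RAG sketch: a dedicated search model fetches patents,
# the LLM only reasons over what was retrieved and must cite it.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def search_patents(query: str, k: int = 3) -> list[dict]:
    """Hypothetical placeholder for a dedicated patent search model
    (e.g. a graph-based engine). Returns publication numbers + passages."""
    return [
        {"id": "US1234567B2", "passage": "A lithium-ion cell with a coated anode..."},
        {"id": "EP7654321A1", "passage": "An electrode coating process comprising..."},
    ][:k]


def answer_with_sources(question: str) -> str:
    hits = search_patents(question)
    context = "\n".join(f"[{h['id']}] {h['passage']}" for h in hits)
    response = client.chat.completions.create(
        model="gpt-4o",  # example model name, an assumption
        messages=[
            {"role": "system",
             "content": "Answer using ONLY the provided patent passages. "
                        "Cite publication numbers. If unsure, say so."},
            {"role": "user", "content": f"Passages:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content


print(answer_with_sources("Which documents describe anode coatings?"))
```

Because the LLM is instructed to answer only from the retrieved passages and to cite their publication numbers, the search model stays responsible for the facts while the LLM handles the conversation.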

As you can see, there are many use cases for LLMs in patent AI.

For us as a software vendor, the challenge lies in pinpointing the areas where the impact is biggest, given Graph AI as our core technology and our mission to provide effortless access to patents and improve the quality of IP.

We also follow our own internal projects and listen closely to the wishes of our customers. For many of them, conversational patent intelligence is high on the agenda after trying the Ask AI feature. This is no wonder, because patent work consists of a continuous chain of smaller and bigger decisions.

A convenient AI assistant helps you uncover relevant patents, R&D insights, or even technology trends through basic everyday conversations.

Armed with these immediate results, you can make informed decisions much more quickly. 

That’s why we’re investing heavily in IPRally’s Ask AI feature, aiming to make it as helpful as possible. Adding the capability to handle large patent datasets at once is the next improvement on our list.

Addressing AI Security Concerns

Let’s finish up this exploration of the future of patent AI with a quick note on security.

Security concerns around AI, especially LLMs, are common. While this is changing for the better, part of the problem has been that these models are so large and powerful that only a few very well-funded players can develop and host them.

Uploading sensitive information to the cloud always, and justifiably, raises questions. What happens to the data? Where and how is it used? Who has access to it?

These questions become even more pressing when it comes to intellectual property.

For that reason, we’ve set out and adhere to the following principles at IPRally:

  • We keep full control of our core patent search AI. All sensitive data used and stored for search purposes by our own AI models remains within our own cloud environment, under our full control, and strongly encrypted both at rest and in transit.
  • We want to deliver the value of LLMs to our users quickly, but securely. Therefore, we hand-pick trustworthy third-party LLM providers and allow our users to opt in to such features rather than applying them by default. Our long-term aim is to host LLMs on our own servers for additional security.

In 2023, we were awarded an ISO 27001 data security certificate and, just recently, the Cyber Essentials certificate, showing our commitment to data security, both AI-related and in general.

Take Patent Search to the Next Level

As we look back at the revolutionary year of 2023 and look towards the future, it's clear that patent AI is going to keep evolving rapidly. 

Advanced technologies like Graph AI and LLMs have heightened expectations, as well as expanded the horizons of what's possible in patent search and analysis.

In terms of intellectual property, the future promises more accuracy, speed, and efficiency. Used the right way, AI will have a positive impact on the quality of IP rights in general – an aspect that should not be forgotten.

IPRally is committed to being a key player in realizing this future, offering user-friendly, secure, and efficient patent search, review, monitoring, and classification tools.

Our goal for 2024 and onwards is clear – to continuously adapt, innovate, and improve. 

Discover how IPRally can enhance your patent search with a free trial.

Sakari Arvela
February 7, 2024