Enhancing Patent Search: Introducing Graph Transformer 2.0

We are delighted to introduce Graph Transformer 2.0, the latest version of our AI model, designed to elevate your patent search experience. In this blog post, we'll walk through what has been upgraded in the new model and how it significantly enhances patent search.

Introduction

Our commitment to improving patent search efficiency led us to create the Graph Transformer, an advanced AI model that uses knowledge graphs, as discussed in our previous blog post. Now, with Graph Transformer 2.0, we've made a substantial leap by increasing the model's size from 6.5 million to 89 million parameters.

Expanding Parameters for Deeper Understanding

In the world of AI, parameters are the numerical weights a model learns during training, roughly analogous to the connections between neurons, and a larger number of them generally means greater capacity to capture complex patterns. Graph Transformer 2.0 achieves its remarkable parameter boost by incorporating additional Graph Transformer layers. With 89 million parameters spread across these added layers, it can dig even deeper into the relationships among concepts within patent documents, making it excel in the patent search domain.
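
For readers who like to see this in concrete terms, the snippet below is a minimal sketch of how a figure such as "89 million parameters" is typically obtained for a PyTorch model. The toy model and its layer sizes are purely illustrative assumptions, not IPRally's actual architecture.

```python
import torch.nn as nn

# Illustrative stand-in model: the real Graph Transformer is not public,
# so these layer sizes are hypothetical.
toy_model = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 512),
)

# A model's "parameter count" is simply the total number of trainable
# weights across all of its layers.
num_params = sum(p.numel() for p in toy_model.parameters() if p.requires_grad)
print(f"{num_params / 1e6:.2f}M parameters")
```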

Let's take a moment to demystify what we mean by layers in the context of deep learning. Imagine these layers as individual processing stages, each with its own function. Just as a chef builds up multiple layers of flavor to create a complex dish, our AI model uses layers to process and refine information step by step. The more layers there are, the more nuanced the model's understanding of the data becomes. In Graph Transformer 2.0, these layers work together to untangle the complexities of patent documents, ensuring that your searches yield the most accurate and relevant results possible.
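
To make the idea of stacked layers a little more tangible, here is a minimal, hypothetical sketch of how transformer-style layers are typically chained so that each stage refines the representation produced by the one before it. The class name, dimensions, and layer type are assumptions for illustration only and do not describe the internals of Graph Transformer 2.0.

```python
import torch
import torch.nn as nn

class ToyGraphTransformer(nn.Module):
    """Hypothetical sketch: a stack of identical layers, where each layer
    refines the node representations produced by the previous one."""

    def __init__(self, dim: int = 256, num_layers: int = 12, num_heads: int = 8):
        super().__init__()
        # Each layer is one "processing stage"; adding layers (and with them
        # parameters) lets the model capture more nuanced relationships.
        self.layers = nn.ModuleList([
            nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads, batch_first=True)
            for _ in range(num_layers)
        ])

    def forward(self, node_embeddings: torch.Tensor) -> torch.Tensor:
        x = node_embeddings
        for layer in self.layers:
            x = layer(x)  # the output of one stage feeds the next
        return x

# Usage: one patent graph with 10 nodes, each a 256-dimensional vector.
nodes = torch.randn(1, 10, 256)
refined = ToyGraphTransformer()(nodes)
```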

Significant Improvements in Search Metrics

We measure the model's performance by evaluating how many novelty (X) citations it brings to the top of the search result list, for example among the top 5 results (top-5 recall). Graph Transformer 2.0 performs exceptionally well across all of these metrics, from top-5 to top-100 recall. Notably, it shines in the field of chemical patents, achieving relative improvements of over 20% in both top-5 and top-10 recall. This represents a substantial enhancement, greatly improving both the quality and the speed of your searches.

Figure: Relative improvement in search recall (%) for Graph Transformer 2.0 compared to Graph Transformer: top-5 (T5R), top-10 (T10R), top-25 (T25R), top-50 (T50R), and top-100 (T100R) recall.
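
For those curious how a metric like top-k recall is computed in practice, the sketch below shows one common formulation: for each query patent, the fraction of its known X citations that appear among the top k results, averaged over all queries. The function and the example document identifiers are hypothetical and do not reflect IPRally's evaluation pipeline or data.

```python
def top_k_recall(ranked_results: list[list[str]], relevant: list[set[str]], k: int) -> float:
    """Average fraction of each query's relevant (X-citation) documents
    that appear among its top-k search results."""
    per_query = [
        len(rel & set(ranking[:k])) / len(rel)
        for ranking, rel in zip(ranked_results, relevant)
        if rel  # skip queries with no known citations
    ]
    return sum(per_query) / len(per_query)

# Hypothetical example: two query patents with known X citations.
rankings = [["US1", "EP2", "US3"], ["EP9", "US4", "EP7"]]
citations = [{"EP2", "US3"}, {"US8"}]
print(top_k_recall(rankings, citations, k=2))  # -> 0.25
```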

Summary

Graph Transformer 2.0 marks a significant advancement in patent search technology. Its larger parameter count and deeper layer stack allow it to process patent text with remarkable depth and precision. The future of patent search has arrived with Graph Transformer 2.0, promising a more efficient, accurate, and insightful patent search experience. The new model is available to all users of IPRally. If you're not already a user, you can explore its potential by contacting us to request a trial account. Start benefiting from the power of Graph Transformer 2.0 for your patent search today!

Krzysztof Sadowski
October 6, 2023