It’s Friday night. You’ve left work, commuted home, ordered DoorDash, and poured your wine for the evening. You dig into the couch cushion and grab the TV remote. The Netflix logo bursts onto the screen. It’s been an unusually long week, so you feel like a comedy tonight. You really want to watch “When Harry Met Sally”, but naturally you don’t want to type out the entire title. So you start with “Harry…”
In this day and age, you expect Netflix to instantly suggest titles based on that keyword — Harry Potter, Dirty Harry, When Harry Met Sally, etc. You want the correct suggestion (along with an array of other options), and you want it fast. If the search doesn’t pop up quickly enough, or if it feeds you irrelevant information, you might get frustrated and start digging around for your remote to switch to Hulu, HBO Max, or Apple TV.
This isn’t just true for streaming apps. Whether people are searching on Louis Vuitton’s website for the latest purse ahead of the holiday shopping season or parsing through stored data before feeding it into a predictive machine learning model, users expect the ability to quickly search through and surface relevant data.
While large companies like Netflix often build their own proprietary search software, plenty of businesses rely on vendors that provide that software for them. Why tie up developer time building a feature that has quickly become a utility, when it’s more important for most companies to focus their engineering resources on features that increase their competitive advantage?
We’re backing Meilisearch because we believe it addresses this pain point in an effective way and has the potential to make fast, effective search engines ubiquitous across the growing number of products with searchable databases that help drive user value. In what follows, we’ll describe what Meilisearch is doing and why we believe that Meilisearch is building the search engine of the future.
The state of the search market today
Enterprise search is not a new category. Since the early days of hosted databases, search has been a critical part of the tech stack for companies. However, the incumbents in the space are not up to the task: their products are either outdated or poorly suited to modern needs, leaving customers underserved.
One of the first enterprise search engines to emerge was Elasticsearch, originally developed by Shay Banon in 2010. Elasticsearch was the third iteration of an open-source project called Compass and ended up being monetized by a company called Elastic N.V. That company started to sell hosted Elastic after merging with Found in 2015 and has since expanded into additional features like observability and security. As a result, it has grown to a ~$6.9B company as of this year.
Algolia, another incumbent in the space, emerged two years later as a “lighter”-weight search engine capable of performing enterprise search through search APIs (Application Programming Interfaces), rather than storing unstructured data in document schemas as Elastic does. Algolia is now a ~$2.25B company as of its latest financing round.
Naturally, both of these products have advantages and disadvantages. Elastic was developed in Java (Algolia in C++) and is better for storing vast amounts of unstructured data. Algolia enables quick and easy web searches but has struggled to scale its pricing model and find product-market fit outside of e-commerce. And although Elastic started open-source, it has since closed its ecosystem, while Algolia has been a closed-source ecosystem since the beginning.
Tailwinds in the search market
Right now, search is a fast-growing market because as the economy continues to digitize, companies are storing and interacting with more data than ever. The growth of data is exponential.
Amount of structured and unstructured data (in GB) (source)
You have to believe that if companies are storing more data, they’re going to need more powerful ways to search through it. This tailwind is what’s driving growth across the search sector.
More and more databases are being created all the time, and yet products remain very slow when it comes to search. In-app or in-product search is cumbersome, and every second of latency matters to the end consumer.
Search now influences all aspects of a business. In a world where “data is the new oil” and companies are storing more data now than ever before, search has become a utility, a requirement, especially for a tech-forward business.
For consumers, if they are able to more easily (or quickly) find the product they want to buy or the service they want to use, companies reap additional revenue. On the other hand, for developers, being able to quickly find relevant data for a product or feature they are building is critical to their efficiency. There are clear business benefits to more efficient developers and happier consumers.
Yet founders who are building products don’t want to have to hire a development team to build a search tool, since it’s expensive to do. They want to leverage an existing platform that they can include in their product.
Additionally, companies are no longer differentiating on tech architecture, but rather on how they use the data available to them. We believe being able to quickly search through, contextualize, and utilize the data is how some companies will gain an enduring competitive advantage.
How Meilisearch is building the search engine of the future
Meilisearch is an open-source search engine for powering web, workflow, and application search. Meilisearch enables any developer that needs a powerful and accessible search engine to get up and running quickly. If you need to find data stored within your website, workflow, or application, Meilisearch is the platform to use.
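To make the search-as-you-type experience from the opening concrete, here is a toy sketch of prefix matching with typo tolerance — the kind of behavior Meilisearch provides out of the box. This is not Meilisearch’s implementation (which is far more sophisticated and written in Rust); the function names and the one-typo threshold are our own illustrative choices.

```python
# Toy illustration of typo-tolerant prefix search, the behavior a search
# engine like Meilisearch provides. NOT Meilisearch's actual implementation.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def suggest(query: str, titles: list[str], max_typos: int = 1) -> list[str]:
    """Return titles containing a word whose prefix matches the query,
    tolerating up to `max_typos` single-character mistakes."""
    q = query.lower()
    results = []
    for title in titles:
        for word in title.lower().split():
            if edit_distance(q, word[:len(q)]) <= max_typos:
                results.append(title)
                break
    return results

titles = ["When Harry Met Sally", "Harry Potter", "Dirty Harry", "Notting Hill"]
print(suggest("harry", titles))  # all three "Harry" titles
print(suggest("harey", titles))  # a single typo is still tolerated
```

In a real deployment you would not scan every title per keystroke; engines like Meilisearch precompute index structures so each query touches only a fraction of the data.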
A big part of what makes Meilisearch special is that it’s built in Rust, one of the most performant programming languages. Rust is lightweight and memory-safe, and it is regularly cited in developer surveys as the most beloved programming language. What’s more, it’s ideal for edge computing and edge search, which is where the search market is going.
This is because more and more companies are trying to run computing on the edge due to cost efficiencies. Edge computing means running fewer processes in the cloud and moving those processes to local locations, such as a user’s computer, an IoT device, or an edge server.
Edge computing is growing in popularity because it aims to solve challenges introduced by an increasingly cloud-centric world, including latency (the amount of time it takes for a data packet to go from one place to another), bandwidth constraints, and network congestion. By being well suited to edge computing, building in Rust makes Meilisearch faster and, crucially, saves on cloud costs.
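To make the latency argument concrete, here is a back-of-the-envelope calculation for a search-as-you-type session. The round-trip times below are illustrative assumptions, not measurements, and the one-request-per-keystroke model is a simplification.

```python
# Back-of-the-envelope cumulative latency comparison for search-as-you-type.
# RTT figures are illustrative assumptions, not benchmarks.
cloud_rtt_ms = 100  # assumed round trip to a distant cloud region
edge_rtt_ms = 10    # assumed round trip to a nearby edge node
keystrokes = len("When Harry Met Sally")  # one search request per keystroke

cloud_total = keystrokes * cloud_rtt_ms
edge_total = keystrokes * edge_rtt_ms
print(f"cloud: {cloud_total} ms total, edge: {edge_total} ms total")
```

Even with these rough numbers, the user waiting on cloud round trips spends an order of magnitude longer staring at a loading state over a full query, which is exactly the gap edge search closes.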
Comparison of Cloud, Edge Cloud, and Edge Computing Architectures (source)
Rust is therefore a huge differentiator for Meilisearch as it allows for quick implementation and superior performance vs. other languages, and they’ve intentionally built on Rust for that reason. Architecture is destiny, and the Meilisearch team is building the right architecture for the future. Other products in the market were built years ago, and everything has changed since then from a technical perspective.
Another big differentiator is that Meilisearch is built in the open. Meilisearch’s open-source approach has conferred the product and distribution advantages seen in other projects. Companies like Confluent, MongoDB, Elastic, Hashicorp, Databricks, and others have benefited from successfully commercializing open-source software.
Open-source code is frequently more secure than proprietary code, adopting an open source strategy can compress the timeline of software development, and above all else, it’s really cost-effective. Put succinctly, it:
Enables superior cost and deployment, since OSS is free from licensing fees
Allows companies to “try before they buy”
Results in faster deployment and time to market (as well as product iteration)
Many companies are strategic consumers of open-source software as a means to reduce the burden on their software engineering teams to build everything from the ground up. Recent research estimates that open source drove between €65–€95 billion of European GDP in 2018 alone, while 90% of all cloud workloads run on Linux (an open-source operating system), and 82% of the smartphone market uses open source. Felicis has talked to a lot of open-source founders, and they validated that Meilisearch creates a lot of value for customers.
Because of the implementation burden imposed by market incumbents like Elastic, and the inability to easily adopt parts of Algolia due to its closed-source nature, a number of open-source projects have emerged as off-the-shelf tools for companies looking for search capabilities. Meilisearch is leading the charge of open-source tools servicing the mid-market and enterprises that find Elasticsearch and Algolia too heavyweight and monolithic.
One of the fastest-growing GitHub projects in history
One thing that helped put Meilisearch on our radar was that it has been one of the fastest-growing GitHub projects in history in terms of stars, commits, forks, DAUs, MAUs, and downloads.
Beyond the excellent traction, product, and growing market, Meilisearch has an incredible, international team with strong technical chops building in Rust, which gives them another leg up against incumbents.
Their inspiring founding vision is to put search everywhere — on devices, in e-commerce searches where latency is so critical, and at the point of data origin.
The shift towards a more distributed tech stack bodes extremely well for the company, given the Rust architecture, which is optimized for running locally, alleviating the issues of latency and server-side bandwidth.
If you believe data is the lifeblood of modern business and if you believe businesses are moving towards a more distributed tech stack, then the inescapable conclusion is that Meilisearch is well equipped to capture massive value being created in a fast-growing market.
That’s why we are so excited to partner with the founders, Quentin and Thomas, and the rest of the team as they look to fulfill their vision of universal, ubiquitous search for all.