How to quickly set up Typesense as a drop-in replacement for Algolia DocSearch on TailwindUI templates

“How do you replace the search in the TailwindUI documentation template (called Syntax)?” or “Can I add Typesense as the search for a Tailwind UI site?”, you might ask.

TailwindUI uses Algolia’s DocSearch in its template offerings (e.g. the documentation template).

DocSearch is a great, easy-to-use solution, but it’s only available for public-facing websites and only free if you are an open source project. I think it’s great that they are giving back to the community, and I wouldn’t mind at all paying for a paid version, but to my surprise I found out there is none.

You can switch to their regular Search product and replace the component in the TailwindUI template, but you still have to build your own HTTP crawler (there’s an add-on for a custom Enterprise plan, but that sounds like $$$$).

For my use case I was looking for a simple solution at a reasonable price, and ideally something I could just drop into the TailwindUI template. And that’s exactly what I found with Typesense.

Basically, the Typesense team forked a lot of the DocSearch components while they were still open source and now offers drop-in replacements for the client-side components (including the React one), plus a paid hosted service for the indexing backend (at a very reasonable price).

It won’t take you more than 10 minutes to replace Algolia with Typesense, assuming you are using GitHub Actions as your deployment pipeline.

Step 1: Create a hosted search cluster with Typesense Cloud

First, head over to https://cloud.typesense.org/ and create yourself a search cluster. For small sites the bare minimum is currently around 15 USD per month (but free to get you started). Creating the cluster takes about 5 minutes; once it’s ready, hit “Generate API Keys”.

Note: Don’t create a collection as the crawler will do that automatically.

The file you download contains your API keys and host names, which you’ll need in the next steps.

=== Typesense Cloud: xxx ===

Admin API Key:
ADMIN_KEY -> Needed for the crawler

Search Only API Key:
API_KEY -> Will be added to the Next.js site

Nodes:
APP_HOST -> URL of your search server (something.a1.typesense.net [https:443])
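
If you want to double-check that the cluster is reachable before wiring anything up, you can hit its health endpoint. A minimal sanity check, assuming your host is the one from the credentials file:

# Should return {"ok":true} once the cluster is up
curl https://xxx.a1.typesense.net/health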

Step 2: Replace the Algolia DocSearch library with typesense-docsearch-react

First, install the Typesense React component from npm:

npm install --save typesense-docsearch-react

Navigate to src/components/Search.jsx and change the import from

import { DocSearchModal, useDocSearchKeyboardEvents } from '@docsearch/react'

to

import { DocSearchModal, useDocSearchKeyboardEvents } from 'typesense-docsearch-react'

Next, you need to adapt the configuration of the docSearchConfig variable from

const docSearchConfig = {
  appId: process.env.NEXT_PUBLIC_DOCSEARCH_APP_ID,
  apiKey: process.env.NEXT_PUBLIC_DOCSEARCH_API_KEY,
  indexName: process.env.NEXT_PUBLIC_DOCSEARCH_INDEX_NAME,
}

to

const docSearchConfig = {
  typesenseCollectionName: process.env.NEXT_PUBLIC_DOCSEARCH_INDEX_NAME,
  typesenseServerConfig: {
    nodes: [
      {
        host: process.env.NEXT_PUBLIC_DOCSEARCH_APP_HOST,
        port: '443',
        protocol: 'https',
      },
    ],
    apiKey: process.env.NEXT_PUBLIC_DOCSEARCH_API_KEY,
  },
}

Note that we are re-using the API_KEY and INDEX_NAME environment variables, but APP_ID has been replaced by APP_HOST.

Next, we need to add these to the .env.local file. Choose an INDEX_NAME here; you will need to specify the same name for the crawler in a later step.

NEXT_PUBLIC_DOCSEARCH_APP_HOST=xxx.a1.typesense.net
NEXT_PUBLIC_DOCSEARCH_API_KEY=xxx
NEXT_PUBLIC_DOCSEARCH_INDEX_NAME=docs

Step 3: Modify the GitHub Actions Script

There are two parts you need to add to the workflow script. First, you need to make the environment variables available to the build process. I use Static Web Apps on Azure, so for this I add an “env” block to the end of the “Build And Deploy” step.

- name: Build And Deploy
  id: builddeploy
  uses: Azure/static-web-apps-deploy@v1
  with:
    azure_static_web_apps_api_token: ...
    repo_token: ...
    action: ...
    app_location: '/'
    api_location: 'functions'
    output_location: ''
  env:
    NEXT_PUBLIC_DOCSEARCH_APP_HOST: xxx.a1.typesense.net
    NEXT_PUBLIC_DOCSEARCH_API_KEY: xxx
    NEXT_PUBLIC_DOCSEARCH_INDEX_NAME: docs
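
If you are not deploying via Azure Static Web Apps, the idea is the same: the NEXT_PUBLIC_* variables just need to be present in the environment when next build runs, because Next.js inlines them into the client bundle at build time. A minimal sketch of a plain build step (values are placeholders):

# Expose the variables to the Next.js build; they are inlined at build time
NEXT_PUBLIC_DOCSEARCH_APP_HOST=xxx.a1.typesense.net \
NEXT_PUBLIC_DOCSEARCH_API_KEY=xxx \
NEXT_PUBLIC_DOCSEARCH_INDEX_NAME=docs \
npm run build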

Last but not least, you need to add the crawler after this step. It will scan your published website and update the search index once the static site has been deployed.

- name: Run DocSearch Scraper
  uses: celsiusnarhwal/typesense-scraper@v2
  with:
    api-key: ${{ secrets.TYPESENSE_API_KEY }}
    host: xxx.a1.typesense.net
    port: 443
    protocol: https
    config: docsearch.config.json

Add a GitHub secret called TYPESENSE_API_KEY with the value of the ADMIN_KEY (not the search-only API_KEY) from Step 1.
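
You can add the secret in your repository settings under Settings → Secrets and variables → Actions, or with the GitHub CLI if you have it installed (the value below is a placeholder):

# Store the admin key as a repository secret for the workflow
gh secret set TYPESENSE_API_KEY --body "YOUR_ADMIN_KEY"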

Step 4: Add the crawler config

In the root of the repository, add a file called docsearch.config.json with the following content:

{
  "index_name": "docs",
  "start_urls": ["https://YOUR_PUBLIC_FACING_WEBSITE/"],
  "selectors": {
    "lvl0": "article h1",
    "lvl1": "article h2",
    "lvl2": "article h3",
    "lvl3": "article h4",
    "lvl4": "article h5",
    "lvl5": "article h6",
    "text": "article header p,article p,article ol,article ul"
  }
}

This instructs the crawler to scan your website starting from the specified URL and add the pages to the specified index. The selectors are optional, but they help the search focus on the main content of the Markdoc output (instead of also indexing the nav, toolbars, etc.).
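
Once the workflow has run at least once, you can check that the crawler actually populated the collection by firing a test query against the cluster. This is a rough sketch using the search-only key from Step 1; the hierarchy.* and content fields follow the typesense-docsearch-scraper schema, so adjust query_by if yours differs:

# Search the "docs" collection for the term "install"
curl -H "X-TYPESENSE-API-KEY: YOUR_SEARCH_ONLY_KEY" \
  "https://xxx.a1.typesense.net/collections/docs/documents/search?q=install&query_by=hierarchy.lvl0,hierarchy.lvl1,content"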

That’s it. Let me know if you have any improvements or comments.