Simon Willison

Just got a great first impression running Open WebUI - took one command to get it up and running (uvx --python 3.11 open-webui serve) and it noticed I had Ollama running and automatically gave me a UI to interact with those models simonwillison.net/2024/Dec/27/
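A minimal sketch of that setup, assuming uv is already installed (if not, the installer one-liner below is from the uv docs) and Ollama is running on its default port:

# Install uv if needed
curl -LsSf https://astral.sh/uv/install.sh | sh
# Launch Open WebUI; it serves on http://localhost:8080 by default
uvx --python 3.11 open-webui serve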

Simon Willison

Also got a reminder of how freakishly capable the tiny (1.9GB) Llama 3.2:3B model is at summarization and question answering against documents you feed into it via its impressive 128,000-token context window

Prompt

How do I use SpatiaLite with Datasette?

Attachment: spatialite.rst

Response

To use SpatiaLite with Datasette, you need to install the mod_spatialite dynamic library. This can then be loaded into Datasette using the --load-extension command-line option.

You can install it on OS X by running brew update and then brew install spatialite-tools. On Linux, SpatiaLite is packaged for most distributions, so you can typically install it with a package manager like apt.

When running Datasette, you can look for SpatiaLite in common installation locations or specify the full path to the extension if it's installed elsewhere.

For example, on OS X, you might run datasette --load-extension=spatialite after installing it with Homebrew. On Linux, you could use datasette --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so.
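Putting those steps together, a hedged sketch (mydb.db is a hypothetical database file; the Debian/Ubuntu package name libsqlite3-mod-spatialite is an assumption based on common packaging, and the library path is the one quoted above):

# macOS, via Homebrew
brew update && brew install spatialite-tools
datasette mydb.db --load-extension=spatialite

# Debian/Ubuntu
sudo apt install libsqlite3-mod-spatialite
datasette mydb.db --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so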

It's also important to note that SpatiaLite adds a large number of additional SQL functions, some of which are not safe for untrusted users to execute. To secure your Datasette instance, consider disabling arbitrary SQL queries and defining canned queries with the SQL queries that use SpatiaLite functions you want people to be able to execute.
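A hedged sketch of that lockdown, assuming the pre-1.0 Datasette convention of "allow_sql": false in metadata.json (exact option names vary by Datasette version; the towns table and the towns_near query are hypothetical):

# Block arbitrary SQL and expose one canned SpatiaLite query
cat > metadata.json <<'EOF'
{
  "allow_sql": false,
  "databases": {
    "mydb": {
      "queries": {
        "towns_near": {
          "sql": "select name from towns where ST_Distance(point, MakePoint(:lon, :lat)) < 0.1"
        }
      }
    }
  }
}
EOF
datasette mydb.db --load-extension=spatialite -m metadata.json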

@simon try granite 3.1 dense or moe. Working pretty well as well.

enamon

@simon Does it only work on text files or can it parse pdf, docx, etc files?

Daniel

@simon You have to be very careful with Ollama and its default context settings, since in many cases it silently truncates inputs rather than using the full context.

This bit me big time and there are open tickets about it. Two things:

1) Check the Ollama server logs to see if there's a log line about truncating inputs

2) You can send num_ctx in the API, or create and save a new Ollama model with a bigger context (sketch below)

Because by default, if you run

ollama run llama3.2

it has a context limit of only 2048 tokens 🙀
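A hedged sketch of both workarounds, using Ollama's documented /api/generate options field and Modelfile PARAMETER syntax (the 8192 value and the llama3.2-8k name are arbitrary examples):

# Fix per request: pass num_ctx in the API options
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Summarize this document: ...",
  "options": { "num_ctx": 8192 }
}'

# Fix permanently: save a model variant with a bigger context
cat > Modelfile <<'EOF'
FROM llama3.2
PARAMETER num_ctx 8192
EOF
ollama create llama3.2-8k -f Modelfile
ollama run llama3.2-8k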

Krzysztof Jeziorny

@simon That `uv` approach is tempting, as I run Open WebUI in Docker and updating it (which happens quite often) takes an eternity.
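With uvx, updating is a one-liner, assuming uv's standard @latest specifier (this re-resolves open-webui to its newest release):

uvx --python 3.11 open-webui@latest serve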
