
Go-CoNLLU - Some Much Needed Machine Learning Support in Go

By Lane Wagner on Jun 8, 2020


Python is commonly seen as the AI/ML language, but it's often a dull blade: dynamic typing makes large codebases fragile, and pure-Python code is slow, like really slow. Many popular natural language processing toolkits only have Python APIs, and we want to see that change. At Nuvi, a social media marketing tool, we use Go for the majority of our data-processing tasks because it lets us write simple, fast code. Today we are open-sourcing a tool that has made our ML lives easier in Go. Say hello to go-conllu.

🔗 What is CoNLL-U?

The Conference on Computational Natural Language Learning (CoNLL) has created multiple file formats for storing natural language annotations. CoNLL-U is one such format, used by the Universal Dependencies project, which hosts many annotated corpora of textual data. To use these corpora, we need a parser that makes it simple for developers to work with the data.
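Concretely, a CoNLL-U file stores one token per line as ten tab-separated columns (ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC), with comment lines starting with `#` and blank lines separating sentences. A minimal sentence might look like this (the specific annotation values below are illustrative):

```
# text = The dog barks.
1	The	the	DET	DT	Definite=Def|PronType=Art	2	det	_	_
2	dog	dog	NOUN	NN	Number=Sing	3	nsubj	_	_
3	barks	bark	VERB	VBZ	Number=Sing|Person=3|Tense=Pres	0	root	_	_
4	.	.	PUNCT	.	_	3	punct	_	_
```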

[Image: Universal Dependencies logo]

🔗 How Does Go-Conllu Help?

Go-conllu parses CoNLL-U data. It is a simple and reliable way to import CoNLL-U data into your application as Go structs.

The full API specifics can be found in the package's GoDoc.
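To get a feel for what the parser does under the hood, here is a minimal, self-contained sketch that splits one CoNLL-U token line into its ten columns. The `Token` struct and its field names here are illustrative assumptions for this example only; the library's actual types are documented in the GoDoc.

```go
package main

import (
	"fmt"
	"strings"
)

// Token mirrors the ten tab-separated CoNLL-U columns.
// Field names are illustrative; see the go-conllu GoDoc for the real types.
type Token struct {
	ID, Form, Lemma, UPOS, XPOS, Feats, Head, DepRel, Deps, Misc string
}

// parseTokenLine splits a single CoNLL-U token line into its columns.
func parseTokenLine(line string) Token {
	cols := strings.Split(line, "\t")
	return Token{
		cols[0], cols[1], cols[2], cols[3], cols[4],
		cols[5], cols[6], cols[7], cols[8], cols[9],
	}
}

func main() {
	tok := parseTokenLine("1\tThe\tthe\tDET\tDT\tDefinite=Def|PronType=Art\t2\tdet\t_\t_")
	fmt.Println(tok.Form, tok.UPOS) // The DET
}
```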

Let’s take a look at the quick-start example from the README. First, download the package.

go get github.com/nuvi/go-conllu

Then in a new project:

package main

import (
	"fmt"
	"log"

	conllu "github.com/nuvi/go-conllu"
)

func main() {
	sentences, err := conllu.ParseFile("path/to/model.conllu")
	if err != nil {
		log.Fatal(err)
	}

	for _, sentence := range sentences {
		for _, token := range sentence.Tokens {
			fmt.Println(token)
		}
	}
}

All the sentences and tokens in the corpus will be printed to the console.

If you need a .conllu corpus file you can download the Universal Dependencies English training model here: en_ewt-ud-train.conllu
