gobuffalo

The difference between a website and a web application is that an app displays dynamic data, while a simple site just presents static content. So far we've developed a site, because nothing there can ever change without us touching the source code. It's time to change that and finally display something from the database.

You might want to check out the previous parts of this tutorial first.

Database in Buffalo

In Buffalo, everything related to the database connection starts and ends with markbates/pop. It is a powerful tool built on top of jmoiron/sqlx, and it handles a lot of things you might not even think of if you were to do the database integration yourself. First of all, it comes with tools for generating models and migrations (pop/soda) and a custom migration syntax (pop/fizz), which are really cool and easy to use!

Before diving into the details, we need a proper configuration set up. To do that we edit the .database.yml file:

# .database.yml
development:
    dialect: postgres
    database: business-card_development
    user: postgres
    password: mysecretpassword
    host: 127.0.0.1
    port: 15432
    pool: 5

test:
    url: {{envOr "TEST_DATABASE_URL" "postgres://postgres:mysecretpassword@127.0.0.1:15432/business-card_test?sslmode=disable"}}

production:
    url: {{envOr "DATABASE_URL" "postgres://postgres:mysecretpassword@127.0.0.1:15432/business-card_production?sslmode=disable"}}

As you can see, you can use environment variables here, which is the recommended approach for passwords, but for development/tutorial purposes we can go with a hardcoded value.

Back to the tools: soda allows you to generate models for your entities in a pretty straightforward way, as you only define the name of a struct and a list of fields that should be included. If no fields are provided, it still creates three of them by default: id of type uuid.UUID and two of type time.Time: created_at and updated_at. These might sound unnecessary at first, but in my opinion they become useful at some point, so it's better to have them from the beginning. To add custom columns, you just define them as column_name:type pairs (the list of available types is in the pop documentation); how to call the generator is shown below.

Apart from the model, we get migrations for our entity as well! By default the fizz syntax is used, but it can be changed to SQL if you prefer. I thought about doing that at first, but it turns out that what fizz proposes is super readable and well documented. You can easily modify the migrations, since the syntax consists of self-explanatory functions like create_table or add_column. Note that the generator pluralizes your entity name, so a Go struct Joke is stored in a jokes SQL table.

Why SQL? At the time of writing, pop supports three SQL databases: Postgres, MySQL, and SQLite.

Last but not least, it's important to understand how all this fits into the Buffalo workflow. It all comes down to the PopTransaction middleware that is registered at application startup:

// actions/app.go
...
app.Use(middleware.PopTransaction(models.DB))
...

You can inject your own custom database middleware, but since I don't have any special preference (and since I've spent so much time explaining how pop works) I'll go with the default solution. Under the covers, this middleware creates an SQL transaction for each request and inserts it into the request's buffalo.Context:

// github.com/gobuffalo/buffalo/middleware/pop_transaction.go
...
var PopTransaction = func(db *pop.Connection) buffalo.MiddlewareFunc {
    return func(h buffalo.Handler) buffalo.Handler {
        return func(c buffalo.Context) error {
            ...
            return db.Transaction(func(tx *pop.Connection) error {
                ...
                c.Set("tx", tx)
                return h(c)
            })
        }
    }
}
...

Now, in each request you can access this transaction; you just need to extract the tx value from the context and cast it to *pop.Connection. Awesome!
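
In code it is a one-liner, something like the snippet below (someHandler is just a placeholder; we'll write a real handler later in this post):

func someHandler(c buffalo.Context) error {
    // the PopTransaction middleware put the transaction here for us
    tx := c.Value("tx").(*pop.Connection)
    // ... run queries on tx ...
    return c.Render(200, r.HTML("some.html"))
}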

Model generation

In order to start working with some data from our Postgres database, we need to make our app aware of what can be found there. We can take advantage of Buffalo's model generator and have the code created automatically:

$ buffalo db generate model experience
v3.30.1
--> models/experience.go
--> models/experience_test.go
...
> migrations/20170831190334_create_experiences.up.fizz
> migrations/20170831190334_create_experiences.down.fizz

The problem is that we just asked it to create an experience model. What does it include, given that we haven't provided any field definitions?

// models/experience.go
...
type Experience struct {
    ID        uuid.UUID `json:"id" db:"id"`
    CreatedAt time.Time `json:"created_at" db:"created_at"`
    UpdatedAt time.Time `json:"updated_at" db:"updated_at"`
}
...

Yeah, that's definitely not enough! We can also see that, apart from the Go struct, Buffalo created two migration files:

// migrations/xxxxx_create_experiences.up.fizz
create_table("experiences", func(t) {
    t.Column("id", "uuid", {"primary": true})
})

// migrations/xxxxx_create_experiences.down.fizz
drop_table("experiences")

The syntax is rather easy to understand, but still, it's not what we expect our model to look like. Let's revert these changes and try again, this time with a slightly modified command.

First, we need to understand how the models are generated. Buffalo uses the pop/soda package for this, and a look at its source code shows that we can pass column names and their types to the generator, so we can create our model with the additional fields:

$ buffalo db generate model experience company:text from:time to:time description:nulls.text
v3.30.1
--> models/experience.go
--> models/experience_test.go
...
> migrations/20170831190902_create_experiences.up.fizz
> migrations/20170831190902_create_experiences.down.fizz

Each entry describing our experience will consist of the company name, our job description (optional) and the time range when we worked there. Now the struct looks much better:

type Experience struct {
    ID          uuid.UUID    `json:"id" db:"id"`
    CreatedAt   time.Time    `json:"created_at" db:"created_at"`
    UpdatedAt   time.Time    `json:"updated_at" db:"updated_at"`
    Company     string       `json:"company" db:"company"`
    From        time.Time    `json:"from" db:"from"`
    To          time.Time    `json:"to" db:"to"`
    Description nulls.String `json:"description" db:"description"`
}

The migration now includes all the columns as well:

// migrations/xxxxx_create_experiences.up.fizz
create_table("experiences", func(t) {
    t.Column("id", "uuid", {"primary": true})
    t.Column("company", "text", {})
    t.Column("from", "timestamp", {})
    t.Column("to", "timestamp", {})
    t.Column("description", "text", {"null": true})
})

// migrations/xxxxx_create_experiences.down.fizz
drop_table("experiences")

Important note: once the model is generated, you are the owner of the code. These files are meant to be edited however you want. Just be aware that generating the same model again will overwrite your changes!
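
As an example of such an edit (we don't need it in this project, it just shows what owning the code means): as far as I remember, pop lets you override the table name it maps a struct to by adding a TableName method, so something like this would store the data in a table named differently than the pluralized default:

// models/experience.go

// TableName makes pop use this table instead of "experiences".
func (e Experience) TableName() string {
    return "work_experiences"
}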

To apply our changes to the database, we run:

(business-card) $ buffalo db migrate up
...
> create_experiences

Making changes

After a while, we realize that our model should contain one more field: the job position name. Since it's our code now, we need to make all the changes manually. We start by creating migrations, which can be bootstrapped with the fizz generator; it only requires us to choose a migration name and creates two empty files:

$ buffalo db generate fizz add_experience_position
v3.30.1
> migrations/20170831191420_add_experience_position.up.fizz
> migrations/20170831191420_add_experience_position.down.fizz

Now we define the up and down operations using the fizz language:

// migrations/xxxxx_add_experience_position.up.fizz
add_column("experiences", "position", "string", {})

// migrations/xxxxx_add_experience_position.down.fizz
drop_column("experiences", "position")

We've done the database part; now let's update the code describing the model in the application. First, add the new field to the struct:

// models/experience.go
...
type Experience struct {
    ...
    Position string `json:"position" db:"position"`
}
...

Also, since Buffalo generates validation mechanisms for us (which we'll use at some point, but not yet), we can add a rule saying that this new field is required and cannot be empty:

// models/experience.go
...
func (e *Experience) Validate(tx *pop.Connection) (*validate.Errors, error) {
    return validate.Validate(
        ...
        &validators.StringIsPresent{Field: e.Position, Name: "Position"},
    ), nil
}
...
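
For the record, these validators don't run by themselves: pop calls Validate for us whenever we save a record through its ValidateAnd* helpers. A minimal sketch (not code we need in the app yet), assuming the tx extracted from the context as before:

// inside some handler, with tx taken from the context
verrs, err := tx.ValidateAndCreate(&experience)
if err != nil {
    return errors.Wrap(err, "failed to save experience")
}
if verrs.HasAny() {
    // e.g. "Position can not be blank." when the new field is empty
    fmt.Println(verrs)
}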

Now (and after any database change) we run the migrations:

$ buffalo db migrate up
...
> add_experience_position

To make sure everything works fine, we'll try to insert an entry directly into the database:

# INSERT INTO experiences(id,company,"from","to",description,position,created_at,updated_at) VALUES ('e0741237-ee86-4a53-94d4-a7f4ba3e85dc', 'Siili Solutions', '2015-07-01', '2017-02-28', 'Developing web applications using multiple technologies, mostly in Go and Java Script (Node.js, React, Redux). Worked with MongoDB, Postgres databases.','Full Stack Developer','2017-08-31','2017-08-31');

INSERT 0 1

It worked! As you can see, the three default columns (id, created_at and updated_at) required some additional effort, but that will pay off soon. Besides, inserting data manually like this is simply not the usual way of doing things in a Buffalo app.
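
For comparison, here is what the pop way looks like: we build the struct in Go and let the library handle the defaults. A rough sketch (the nulls package ships alongside pop; in a real handler we'd rather go through ValidateAndCreate, as sketched earlier):

exp := models.Experience{
    Company:     "Siili Solutions",
    Position:    "Full Stack Developer",
    From:        time.Date(2015, 7, 1, 0, 0, 0, 0, time.UTC),
    To:          time.Date(2017, 2, 28, 0, 0, 0, 0, time.UTC),
    Description: nulls.NewString("Developing web applications in Go and JavaScript."),
}
// pop fills in id, created_at and updated_at on its own
if err := tx.Create(&exp); err != nil {
    return errors.Wrap(err, "failed to create experience")
}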

Displaying data

All of the above would not mean a thing if we didn't display the data on the page. There are just two steps left: fetch the data and change the template to render it. In order to access the database, we need to extract the *pop.Connection transaction (put there by the middleware) from the request context and run our queries on it:

func ResumeHandler(c buffalo.Context) error {
    tx := c.Value("tx").(*pop.Connection)
    var experience []models.Experience
    if err := tx.All(&experience); err != nil {
        return errors.Wrap(err, "failed to load experience data")
    }
    c.Set("experience", experience)
    return c.Render(200, r.HTML("resume.html"))
}

If you've ever worked with SQL in Go, this code should not look strange to you. We fetch all the rows from the experiences table and put them into a slice of models.Experience items. The last line before rendering the template inserts the data into the context under the experience key, which we'll use in the resume.html template.
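
One small touch we could add here: pop's query builder lets us chain calls like Order before All, so that for example the most recent jobs come first. Something along these lines should do:

// "from" has to be quoted, since it's a reserved word in SQL
if err := tx.Order("\"from\" DESC").All(&experience); err != nil {
    return errors.Wrap(err, "failed to load experience data")
}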

Finally, we can make changes to the HTML. Since we can have multiple entries in our resume, we need to iterate over the experience slice with a for loop. Note that we can call methods on our Go values right inside plush tags, which helps us, for example, format the time range the way we prefer:

<div class="resume">
    <div class="experience">
        <h3>Experience</h3>
        <div class="experience-entries">
            <%= for (e) in experience { %>
                <div class="experience-entry">
                    <div class="experience-entry__timerange">
                        <%= e.From.Format("Jan 2006") %>
                        -
                        <%= e.To.Format("Jan 2006") %>
                    </div>
                    <div class="experience-entry__details">
                        <h4><%= e.Position %></h4>
                        <h5><%= e.Company %></h5>
                        <p><%= e.Description %></p> 
                    </div>
                </div>
            <% } %> 
        </div>

    </div>
</div>

If we now open http://localhost:3000/resume, we'll see the information from the database.

The full source code of this example is available on GitHub.