README for cloud

Overview

This repository contains:

  1. All backend server-side code, written in Golang.
  2. All public-facing custom web sites, specifically the portal.
  3. PostgreSQL schema, sample/seed data, and tools.

Developers

Dependencies

In order to build this project and run it yourself you’ll need the following dependencies: Go, Docker (with docker-compose), Node.js, yarn, and make.

NOTE: It should be possible to use npm if you’d like to avoid installing yarn.

Overview

The server side architecture is broken into two main parts:

Server: A monolithic Golang service that provides the backend REST API for FieldKit. It requires a PostgreSQL instance, which is covered later. It also reads objects from S3 for a more seamless developer experience. Any write that would go to S3 goes to the local file system when running in the single-machine configuration.

Portal: A Vue.js (JavaScript/TypeScript) single page application that communicates with the backend server’s API; this is the shiny and interesting stuff.

Local Setup

Clone:

git clone https://github.com/fieldkit/cloud.git

Build:

make

This will establish some local defaults suitable for a single developer machine and build all of the golang binaries, after which the portal’s assets will be bundled and the project’s tests will run. It’s good practice to run this periodically, especially before pushing, to ensure that tests still pass and there aren’t any serious errors when generating the asset bundles.

PostgreSQL:

make restart-postgres

This will execute the necessary docker commands to start the PostgreSQL container. The container will load any SQL that’s in the schema directory. On a fresh install this directory will be empty, but it can be used later to load testing databases.
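
To confirm the container came up, you can list the running containers (the container’s exact name depends on your local docker setup, so treat the output as a guide):

docker ps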

Migrations:

A fresh database install will have no schema, so it’s important to migrate the database. To do this, developers will first have to download the migration tool from:

https://github.com/golang-migrate/migrate/releases
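
For example, on 64-bit Linux the download and extraction might look roughly like this (the release version and asset name here are assumptions; check the releases page for the current ones):

curl -LO https://github.com/golang-migrate/migrate/releases/download/v4.15.2/migrate.linux-amd64.tar.gz
tar xzf migrate.linux-amd64.tar.gz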

Then copy the migrate binary into a directory on your PATH. Once that is available, you can migrate:

make migrate-up

This will execute several SQL files. It’s a good practice to run this periodically after fetching upstream changes, just to be sure the schema stays up to date.

Run the server:

./run-server.sh

This runs the server with the default developer configuration, connecting to the PostgreSQL instance.

Run the portal dev-server:

In order to view the portal locally, developers will need to serve the portal’s static assets. To speed up development it’s suggested that developers use yarn serve to serve the portal assets and allow for automatic reloading:

cd portal
yarn install
yarn serve

As noted above, you are free to use npm as well. If everything is successful you should see something like this:

App running at:
- Local:   http://localhost:8081/
- Network: http://192.168.0.100:8081/

Note that the development build is not optimized.
To create a production build, use yarn build.

With all of this done, you should be able to navigate your browser to http://127.0.0.1:8081 and get a login screen. Notice that due to CORS the hostname that’s used here is important for things to work correctly.

Database

Migrations

This repository uses the following third-party tool for migrations:

https://github.com/golang-migrate/migrate/tree/master/cmd/migrate

Setup

  1. The link above has instructions on how to download and install the tool. Put the tool somewhere on your PATH.

New Migrations

  1. There’s a tool, mkm.sh, in this directory that will create a new migration with the name you’ve given. It will create new up and down SQL files in the migrations directory (see the example below).
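
For example, an invocation might look like this (the exact calling convention and the migration name here are illustrative assumptions):

./mkm.sh add-station-notes

This should leave a matching pair of up and down SQL files in the migrations directory for you to edit.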

Migrating

  1. Just run `make migrate-up` to migrate your local database.
  2. To rerun a migration, run `make migrate-down` and then re-run `make migrate-up`.

Example Data

Out of the box the database is pretty boring and doesn’t even contain a user to test with, so the first step is to register a test account.

If you have a sample database, you can follow these instructions to load that instead.

Copy

Sample databases usually come as a sql.bz2 file, and the first step is to place that file in a directory named schema-production inside the cloud directory.
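
For example, assuming a downloaded dump named fk-sample.sql.bz2 (the filename here is hypothetical):

mkdir -p schema-production
cp ~/Downloads/fk-sample.sql.bz2 schema-production/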

Load

With that in place, you can run make schema-production in order to configure the PostgreSQL docker container to load the sample data.

It can take a while for PostgreSQL to load a large sample database, so you may see connection refused errors until the process is done. To check the progress you can tail the logs of the docker container.
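
One way to do that (the container name is whatever docker assigned locally, so look it up first):

docker ps
docker logs -f <postgres-container-name>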

Notes

Docker Installation (Linux)

Here are some quick instructions on how to install and configure Docker under Ubuntu:

sudo apt-get install docker.io docker-compose nodejs
npm install -g yarn

In order to avoid having to run all docker commands with sudo, we recommend adding your user to the docker group:

sudo gpasswd -a $USER docker
newgrp docker

With this in place, you should be able to run docker ps as your user with no errors.

