Topcoder Challenge API

This microservice provides access and interaction with all sorts of Challenge data.

Development status


Deployment status

Dev: CircleCI Prod: CircleCI

Swagger definition

Intended use

  • Production API

Related repos

Prerequisites

Configuration

Configuration for the application is at config/default.js. The following parameters can be set in config files or in env variables:

  • READONLY: sets the API in read-only mode. POST/PUT/PATCH/DELETE operations will return 403 Forbidden
  • LOG_LEVEL: the log level, default is 'debug'
  • PORT: the server port, default is 3000
  • AUTH_SECRET: The authorization secret used during token verification.
  • VALID_ISSUERS: The valid issuers of tokens.
  • AUTH0_URL: AUTH0 URL, used to get M2M token
  • AUTH0_PROXY_SERVER_URL: AUTH0 proxy server URL, used to get M2M token
  • AUTH0_AUDIENCE: AUTH0 audience, used to get M2M token
  • TOKEN_CACHE_TIME: AUTH0 token cache time, used to get M2M token
  • AUTH0_CLIENT_ID: AUTH0 client id, used to get M2M token
  • AUTH0_CLIENT_SECRET: AUTH0 client secret, used to get M2M token
  • BUSAPI_URL: Bus API URL
  • KAFKA_ERROR_TOPIC: Kafka error topic used by bus API wrapper
  • AMAZON.AWS_ACCESS_KEY_ID: The AWS access key ID to use when connecting. When using local DynamoDB you can set a fake value
  • AMAZON.AWS_SECRET_ACCESS_KEY: The AWS secret access key to use when connecting. When using local DynamoDB you can set a fake value
  • AMAZON.AWS_REGION: The AWS region to use when connecting. When using local DynamoDB you can set a fake value
  • AMAZON.IS_LOCAL_DB: Whether to use Amazon DynamoDB Local or the hosted service.
  • AMAZON.DYNAMODB_URL: The local url if using Amazon DynamoDB Local
  • AMAZON.ATTACHMENT_S3_BUCKET: the AWS S3 bucket to store attachments
  • ES: config object for Elasticsearch
  • ES.HOST: Elasticsearch host
  • ES.API_VERSION: Elasticsearch API version
  • ES.ES_INDEX: Elasticsearch index name
  • ES.ES_REFRESH: Elasticsearch refresh method. Defaults to the string 'true' (i.e. refresh immediately)
  • FILE_UPLOAD_SIZE_LIMIT: the file upload size limit in bytes
  • OPENSEARCH: flag to use the OpenSearch npm client instead of the Elasticsearch one
  • RESOURCES_API_URL: TC resources API base URL
  • GROUPS_API_URL: TC groups API base URL
  • PROJECTS_API_URL: TC projects API base URL
  • CHALLENGE_MIGRATION_APP_URL: migration app URL
  • TERMS_API_URL: TC Terms API Base URL
  • COPILOT_RESOURCE_ROLE_IDS: copilot resource role ids allowed to upload attachment
  • HEALTH_CHECK_TIMEOUT: health check timeout in milliseconds
  • SCOPES: the configurable M2M token scopes, refer config/default.js for more details
  • M2M_AUDIT_HANDLE: the audit name used when performing create/update operations with an M2M token
  • FORUM_TITLE_LENGTH_LIMIT: the forum title length limit

You can find sample .env files inside the /docs directory.
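The parameters above are read from config/default.js. As a minimal sketch of how such a file typically maps env variables to config values (an assumed shape for illustration, not the actual file in the repo):

```javascript
// Sketch of a config/default.js-style file: each value falls back to a
// default when the corresponding env variable is not set.
const config = {
  READONLY: process.env.READONLY === "true",
  LOG_LEVEL: process.env.LOG_LEVEL || "debug",
  PORT: process.env.PORT || 3000,
  AMAZON: {
    IS_LOCAL_DB: process.env.IS_LOCAL_DB
      ? process.env.IS_LOCAL_DB === "true"
      : true,
    DYNAMODB_URL: process.env.DYNAMODB_URL || "http://localhost:8000",
  },
};

module.exports = config;
```

Setting an env variable (directly or via .env) overrides the corresponding default.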

Available commands

  1. Drop/delete tables: npm run drop-tables
  2. Creating tables: npm run create-tables
  3. Seed/Insert data to tables: npm run seed-tables
  4. Initialize/Clear database in default environment: npm run init-db
  5. View table data in default environment: npm run view-data <ModelName>, where ModelName can be Challenge, ChallengeType, AuditLog, Phase, TimelineTemplate or Attachment
  6. Create Elasticsearch index: npm run init-es, or to re-create index: npm run init-es force
  7. Synchronize ES data and DynamoDB data: npm run sync-es
  8. Start all the depending services for local deployment: npm run services:up
  9. Stop all the depending services for local deployment: npm run services:down
  10. Check the logs of all the depending services for local deployment: npm run services:logs
  11. Initialize the local environments: npm run local:init
  12. Reset the local environments: npm run local:reset

Notes

  • The seed data is located in src/scripts/seed

Local Deployment

  1. Make sure you are using Node v10+ (check with node -v). We recommend using NVM to quickly switch to the right version:

    nvm use
  2. 📦 Install npm dependencies

    # export the production AWS credentials to access the topcoder-framework private repos in AWS codeartifact
    aws codeartifact login --tool npm --repository topcoder-framework --domain topcoder --domain-owner 409275337247 --region us-east-1 --namespace @topcoder-framework
    
    # install dependencies
    yarn install
  3. ⚙ Local config
    In the challenge-api root directory create a .env file with the following environment variables. Values for the Auth0 config should be shared with you on the forum.

    # Auth0 config
    AUTH0_URL=
    AUTH0_PROXY_SERVER_URL=
    AUTH0_AUDIENCE=
    AUTH0_CLIENT_ID=
    AUTH0_CLIENT_SECRET=
    
    # Locally deployed services (via docker-compose)
    IS_LOCAL_DB=true
    DYNAMODB_URL=http://localhost:8000
    • Values from this file will be automatically picked up by many npm commands.
    • ⚠️ Never commit this file or its copy to the repository!
  4. 🚢 Start docker-compose with services which are required to start Topcoder Challenges API locally

    npm run services:up
  5. ♻ Prepare the database and Elasticsearch in two steps:

  1. ♻ Create tables.

    npm run create-tables
    # Use `npm run drop-tables` to drop tables.
  2. ♻ Init DB, ES

    npm run local:init

    This command will do 3 things:

  • Create Elasticsearch indexes (dropping them first if they exist)
  • Initialize the database by cleaning all the records
  • Import the data to the local database and index it into Elasticsearch
  6. 🚀 Start Topcoder Challenge API

    npm start

    The Topcoder Challenge API will be served on http://localhost:3000
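Once the API is up, you can smoke-test it with curl. The /v5/challenges route below is an assumption for illustration; confirm the exact paths against the Swagger definition.

```shell
# Requires the locally running API started above; the route prefix is assumed
curl "http://localhost:3000/v5/challenges?page=1&perPage=10"
```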

Production deployment

  • TBD

Running tests

Configuration

Test configuration is at config/test.js. You don't need to change them. The following test parameters can be set in config file or in env variables:

  • ADMIN_TOKEN: admin token
  • COPILOT_TOKEN: copilot token
  • USER_TOKEN: user token
  • EXPIRED_TOKEN: expired token
  • INVALID_TOKEN: invalid token
  • M2M_FULL_ACCESS_TOKEN: M2M full access token
  • M2M_READ_ACCESS_TOKEN: M2M read access token
  • M2M_UPDATE_ACCESS_TOKEN: M2M update (including 'delete') access token
  • S3_ENDPOINT: endpoint of AWS S3 API, for unit and e2e test only; default to localhost:9000

Prepare

  • Start Local services in docker.
  • Create DynamoDB tables.
  • Initialize ES index.
  • Various config parameters should be properly set.

Seeding db data is not needed.

Running unit tests

To run unit tests alone

npm run test

To run unit tests with coverage report

npm run test:cov

Running integration tests

To run integration tests alone

npm run e2e

To run integration tests with coverage report

npm run e2e:cov

Verification

Refer to the verification document Verification.md

Notes

  • After uploading attachments, the returned attachment ids should be used to update the challenge. Each attachment has a challengeId field linking to its challenge, and each challenge has an attachments field linking to its attachments; this two-way link speeds up challenge CRUD operations.
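The two-way link described above can be pictured with a small sketch. The field names follow this README, but the ids and object shapes are hypothetical, for illustration only:

```javascript
// Each attachment points back at its challenge via challengeId.
const attachments = [
  { id: "attachment-1", challengeId: "challenge-1", name: "spec.pdf" },
  { id: "attachment-2", challengeId: "challenge-1", name: "design.zip" },
];

// After upload, the returned attachment ids are written onto the challenge's
// attachments field, so reading a challenge needs no extra query by challengeId.
const challenge = {
  id: "challenge-1",
  attachments: attachments.map((a) => a.id),
};
```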

  • In the Topics field of app-constants.js, the topics in use are test topics; the suggested ones are commented out because they have not been created in TC dev Kafka yet.
