Docs Release 2024-04-001 #296

Merged
merged 55 commits into from
Apr 9, 2024
55 commits
1ac186e
[update] - for qsv2
Mar 28, 2024
3e8ddc9
[test] - test only
Mar 28, 2024
820630f
[test] - test only
Mar 28, 2024
45ab10a
[test] - revert test changes
Mar 28, 2024
2feec66
[test] - remove social plugin temporarily
Mar 28, 2024
371d869
[test] - reinstate social and reduce description length
Mar 28, 2024
ba92211
[test] - try changing font
Mar 28, 2024
1e1fc99
[test] - revert recent change
Mar 28, 2024
3d8ccbb
[test] - revert back
Mar 28, 2024
bc2109e
[fix] - remove social plugin - faulty
Mar 28, 2024
06346c2
[wip] - update influxdb variables
Mar 28, 2024
3e9ae3a
Merge pull request #292 from quixio/tonybedford/sc-55884/improve-time…
tbedford Apr 2, 2024
41e203c
[fix] - add social plugin back in
Apr 2, 2024
924dc6b
Merge pull request #291 from quixio/tonybedford/sc-55822/update-strea…
tbedford Apr 2, 2024
b320e8f
[wip] - cli first approach
Apr 2, 2024
740c428
[wip] - move connectors back temporarily
Apr 2, 2024
94ff40a
[wip] - rework landing page
Apr 2, 2024
6c1517b
[wip] - remove connectors
Apr 2, 2024
9a04271
[wip] - continue journey
Apr 3, 2024
bae971c
[fix] - tweaks based on latest csv data
Apr 3, 2024
7b22187
[wip] - add exploration with qs v2
Apr 3, 2024
6660657
[wip] - expand intro
Apr 3, 2024
d801b23
[wip] - add some comments
Apr 3, 2024
4a4b6cf
[wip] - move integrations back to top level
Apr 4, 2024
37fd74a
[wip] - improvements
Apr 4, 2024
c058c52
[wip] - add cli
Apr 4, 2024
6e18319
[wip] - further improvements
Apr 4, 2024
22c9a75
[wip] - placeholder text for concepts
Apr 4, 2024
171e4df
[fix] - typo
Apr 4, 2024
c6dba27
[fix] - title too big for description on social card
Apr 4, 2024
6257ab8
[wip] - tweak based on feedback
Apr 5, 2024
d23a633
[wip] - restructure kb
Apr 5, 2024
0deda6d
[wip] - extend concept descriptions
Apr 5, 2024
cab463a
[add] - field keys and use qs 2.4.1 application constructor
Apr 5, 2024
1b490c6
Merge pull request #294 from quixio/tonybedford/sc-56021/changes-to-i…
tbedford Apr 5, 2024
110f823
[wip] - make overview titles more descriptive
Apr 8, 2024
41b6153
[wip] - remove references to v2
Apr 8, 2024
430916e
[wip] - replace Application.Quix with Application() as no longer requ…
Apr 8, 2024
5accfb2
Merge branch 'dev' into tonybedford/sc-55708/implement-cli-first-deve…
tbedford Apr 8, 2024
dec347c
[wip] - minor tweak to git url
Apr 8, 2024
2ffdbab
Merge branch 'tonybedford/sc-55708/implement-cli-first-developer-jour…
Apr 8, 2024
548231c
[fix] - step 3 you dont need pat
Apr 8, 2024
958ba3c
[add] - next step
Apr 8, 2024
5d6c7d2
[wip] - add note on multiple platforms in install guide
Apr 8, 2024
a48ed1e
[update] - changelog
Apr 8, 2024
cf21b9a
Merge pull request #295 from quixio/update-changelog-20240408-01
tbedford Apr 8, 2024
fa29f20
[chore] - split concepts into own topics for later expansion
Apr 8, 2024
5aeb782
[add] - more explanation to transform
Apr 8, 2024
1028d7d
[wip] - improve explanations
Apr 8, 2024
66ee480
[fix] - transforms no longer contain sdf in name
Apr 8, 2024
bcd02d0
[fix] - add tip to clarify process
Apr 9, 2024
1628b68
[fix] - reword for clarity
Apr 9, 2024
9c3c3a6
[fix] - modify tip and project creation based on feedback
Apr 9, 2024
d128408
[chore] - clarification of what you're doing
Apr 9, 2024
a043b54
Merge pull request #293 from quixio/tonybedford/sc-55708/implement-cl…
tbedford Apr 9, 2024
10 changes: 9 additions & 1 deletion CHANGELOG.md
@@ -11,6 +11,14 @@ Changelogs for previous years can be found [here](#changelog-archives).
* [2022](./changelogs/2022-archive.md)
* [2023](./changelogs/2023-archive.md)

## 2024-04-02 | 08 APRIL 2024

`BUG FIXES`

- Secrets were being filtered by environment
- Variables weren't updating in the deployments dialogue
- External topics weren't being recognised by new deployments

## 2024-03-02-topic-details-hf | 13 MARCH 2024

`IMPROVEMENTS`
@@ -195,7 +203,7 @@ Changelogs for previous years can be found [here](#changelog-archives).
- Available the first version of the Quix CLI
- Try it out from here: https://github.com/quixio/quix-cli
- Permission system (CLI only)
- New permission system allowing granular permission using roles such as Admin, Manager, Editor and Viewer on platform resources like Environments and Projects. More information here: https://quix.io/docs/get-started/cli.html#setting-permissions
- New permission system allowing granular permission using roles such as Admin, Manager, Editor and Viewer on platform resources like Environments and Projects. More information here: https://quix.io/docs/kb/cli.html#setting-permissions
- Added support for Git-Submodules on our Git integration
- Private Code Samples support for BYOC clusters
- Project settings refactor
4 changes: 2 additions & 2 deletions docs/apis/portal-api/overview.md
@@ -1,9 +1,9 @@
---
title: Portal API
description: The Portal API gives access to the Portal interface, enabling you to programmatically control projects, environments, applications, and deployments.
description: The Portal API gives access to the Portal interface.
---

# Overview
# Overview - Portal API

The Portal API gives access to the Portal interface, enabling you to programmatically control projects, environments, applications, and deployments.

2 changes: 1 addition & 1 deletion docs/apis/query-api/overview.md
@@ -3,7 +3,7 @@ title: Query API
description: The Query API enables you to fetch persisted data stored in the Quix platform.
---

# Overview
# Overview - Query API

The Query API enables you to fetch persisted data stored in the Quix platform. You can use it for exploring the platform, prototyping applications, or working with stored data in any language with HTTP capabilities.

2 changes: 1 addition & 1 deletion docs/apis/streaming-reader-api/overview.md
@@ -3,7 +3,7 @@ title: Streaming Reader API
description: Streaming Reader API supports real-time data streaming over WebSockets.
---

# Overview
# Overview - Streaming Reader API

Quix supports real-time data streaming over WebSockets (or Long Polling depending on client support).

2 changes: 1 addition & 1 deletion docs/apis/streaming-writer-api/overview.md
@@ -3,7 +3,7 @@ title: Streaming Writer API
description: Streaming Writer API supports real-time data streaming over HTTP or WebSockets.
---

# Overview
# Overview - Streaming Writer API

The Streaming Writer API enables you to stream data into a Quix topic using HTTP endpoints or Microsoft's [SignalR](https://learn.microsoft.com/en-us/aspnet/signalr/overview/getting-started/introduction-to-signalr){target=_blank} technology. You can use the Streaming Writer API from any language or web client with an HTTP (REST) or WebSockets interface.

2 changes: 1 addition & 1 deletion docs/create/create-environment.md
@@ -69,7 +69,7 @@ Each of these is described briefly in the following sections.

### Quix broker

The simplest and most convenient choice is to use Quix-managed Kafka. No installation of Kafka is required, and configuration can be done through the UI if you need to change the sensible default values. Very little knowledge of Kafka is expected, beyond basic familiarity with concepts such as [topics](../get-started/glossary.md#topic).
The simplest and most convenient choice is to use Quix-managed Kafka. No installation of Kafka is required, and configuration can be done through the UI if you need to change the sensible default values. Very little knowledge of Kafka is expected, beyond basic familiarity with concepts such as [topics](../kb/glossary.md#topic).

There is a small charge for storage for messages persisted in a topic:

5 changes: 1 addition & 4 deletions docs/deploy/state-management.md
@@ -63,10 +63,7 @@ def count_messages(value: dict, state: State):

load_dotenv()

app = Application.Quix(
consumer_group="cpu_load",
auto_create_topics=True,
)
app = Application()

topic = app.topic('cpu-load')
sdf = app.dataframe(topic)
1 change: 0 additions & 1 deletion docs/develop/authentication/quix-streams.md
@@ -4,7 +4,6 @@ If you're using Quix Streams for local development, you'll need to set some envi

```
Quix__Sdk__Token="sdk-12345"
Quix__Portal__Api="portal-api.platform.quix.io"
```

You can read the documentation on [obtaining your SDK token](./streaming-token.md), also known as the streaming token.
6 changes: 3 additions & 3 deletions docs/develop/authentication/streaming-token.md
@@ -14,7 +14,7 @@ If you are looking for a bearer token to access the Quix APIs, such as the Porta

The streaming token is primarily used to authenticate the [Quix Streams client library](https://quix.io/docs/quix-streams/introduction.html).

When using it with `QuixStreamingClient`, you no longer need to provide all broker credentials manually, they’ll be acquired when needed and set up automatically.
When using Quix Streams in Quix Cloud, you no longer need to provide all broker credentials manually, they’ll be acquired when needed and set up automatically.

!!! warning

@@ -28,9 +28,9 @@ Having two keys lets you update your services without interruption, as both `Tok

You have two main options regarding how you rotate:

1. The easiest way to rotate comes with some service downtime. This assumes you do not directly set the token for your `QuixStreamingClient`, instead you let Quix take care of it for you by using the default environment variable. In this scenario all you have to do is rotate keys, stop and start all your deployments. Until a service is restarted it’ll try to communicate with Quix using the deactivated token. If you’re using local environments, those need to be updated manually.
1. The easiest way to rotate comes with some service downtime. This assumes you do not directly set the token for your Quix Streams code, instead you let Quix take care of it for you by using the default environment variable. In this scenario all you have to do is rotate keys, stop and start all your deployments. Until a service is restarted it’ll try to communicate with Quix using the deactivated token. If you’re using local environments, those need to be updated manually.

2. The alternative option is more complicated, but you can achieve no downtime. This requires you to set a new environment variable you control. This should point to the token to be used. Provide the value of this environment variable to `QuixStreamingClient` by passing it as an argument. Once you have that, set the value of this environment variable to `Token 2` and start your services. When you’re sure you replaced the tokens for all services, rotate your keys.
2. The alternative option is more complicated, but you can achieve no downtime. This requires you to set a new environment variable you control. This should point to the token to be used. Provide the value of this environment variable to your Quix Streams code by passing it as an argument. Once you have that, set the value of this environment variable to `Token 2` and start your services. When you’re sure you replaced the tokens for all services, rotate your keys.
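
For illustration, a minimal sketch of this approach, assuming a recent Quix Streams version in which `Application` accepts a `quix_sdk_token` argument (check the Quix Streams documentation for your version) and a hypothetical environment variable name:

``` python
import os
from quixstreams import Application

# MY_STREAMING_TOKEN is a hypothetical variable name you control; point it at
# the currently active token (for example, Token 2) before rotating your keys.
app = Application(quix_sdk_token=os.environ["MY_STREAMING_TOKEN"])
```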

!!! note

5 changes: 1 addition & 4 deletions docs/develop/integrate-data/inbound-webhooks.md
@@ -13,10 +13,7 @@ import json
import hmac
import hashlib

app = Application.Quix(
consumer_group="sample_webhook_group",
auto_create_topics=True,
)
app = Application()

serializer = JSONSerializer()
output_topic = app.topic(os.environ["output"])
5 changes: 1 addition & 4 deletions docs/develop/integrate-data/polling.md
@@ -7,10 +7,7 @@ from quixstreams import Application
from quixstreams.models.serializers.quix import JSONSerializer, SerializationContext
import time, os, requests

app = Application.Quix(
consumer_group="sample_consumer_group",
auto_create_topics=True,
)
app = Application()

serializer = JSONSerializer()
output_topic = app.topic(os.environ["output"])
5 changes: 1 addition & 4 deletions docs/develop/integrate-data/quix-streams.md
@@ -21,10 +21,7 @@ def get_cpu_load():
"timestamp": int(time.time_ns()),
}

app = Application.Quix(
consumer_group="cpu_load",
auto_create_topics=True,
)
app = Application()

serializer = JSONSerializer()
output_topic = app.topic("cpu-load")
2 changes: 1 addition & 1 deletion docs/develop/integrate-data/read-csv.md
@@ -49,7 +49,7 @@ from quixstreams import Application
load_dotenv()

# Create an Application to connect to the Quix broker with SDK token
app = Application.Quix()
app = Application()

# Define an output topic
output_topic = app.topic(os.environ["output"])
5 changes: 1 addition & 4 deletions docs/develop/integrate-data/web-app.md
@@ -18,10 +18,7 @@ from datetime import datetime
from waitress import serve
import os, json

app = Application.Quix(
consumer_group="sample_consumer_group",
auto_create_topics=True,
)
app = Application()

serializer = JSONSerializer()
output_topic = app.topic(os.environ["output"])
2 changes: 1 addition & 1 deletion docs/develop/process/overview.md
@@ -3,6 +3,6 @@ title: Processing data
description: This section of the documentation covers processing your data.
---

# Overview
# Overview - Processing data

This section of the documentation covers **processing your data**.
2 changes: 1 addition & 1 deletion docs/develop/process/timeseries-events.md
@@ -97,5 +97,5 @@ In the above code, the event generating code has been removed for simplicity. No

* [Example code](https://github.com/quixio/tutorial-code/blob/main/generate-events/README.md){target=_blank} - the complete code for the example.
* [Quix Streams](https://quix.io/docs/quix-streams/introduction.html) - documentation on data formats, publishing, and subscribing to topics.
* [Quix Tour](../../get-started/quixtour/overview.md) - generates processing based on threshold triggering.
* [Quix Tour](../../quix-cloud/quixtour/overview.md) - generates processing based on threshold triggering.
* [Currency alerting](../../tutorials/currency-alerting/currency-alerting.md) - tutorial on generating events based on a threshold.
92 changes: 92 additions & 0 deletions docs/get-started/build-cli.md
@@ -0,0 +1,92 @@
# Build a pipeline using Quix CLI

In the previous sections of the documentation, you explored using Quix Streams. You now continue your command-line journey by installing the Quix CLI and using it to connect to Quix Cloud. You create a simple project on the command line, and then sync it with your Quix Cloud pipeline view.

## Step 1: Install Quix CLI

```
curl -fsSL https://github.com/quixio/quix-cli/raw/main/install.sh | sudo bash
```

For further details on installation, including instructions for Microsoft Windows, see the [install guide](https://github.com/quixio/quix-cli?tab=readme-ov-file#installation-of-quix-cli){target=_blank}.

## Step 2: Sign up to Quix Cloud for free

Sign up [here](https://portal.platform.quix.io/self-sign-up){target=_blank}.

!!! tip

If you are prompted to create a project, you can stop at this point and proceed to the next step in this tutorial. You will return to the project creation wizard in a later step, because you are going to base your Quix project on a Git repository you have yet to create.

## Step 3: Log in using the CLI

```
quix login
```

If you're not already logged in to Quix Cloud, you'll be prompted to log in.

## Step 4: Create a Git repository

Create a Git repo where you can store your files; for example, you could use GitHub. Initialize the repo with a `README.md` file so it can be cloned more easily.

## Step 5: Clone your Git repo into your local project directory

For example, if your GitHub repo is named `cli-app`:

```
git clone <url-to-git>/cli-app
cd cli-app
```

## Step 6: Initialize your project as a Quix project

In your Git project directory, enter:

```
quix local init
```

This initializes your Quix project with a `quix.yaml` file, which describes the project.


## Step 7: Create your application locally

Now create a sample application:

```
quix local app create starter-transformation
```

This creates a starter transformation. You can explore the files created locally. The `main.py` code will look familiar if you've tried the [previous sections](./welcome.md) of the documentation.
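
The exact files generated depend on your CLI and template versions, but as a rough sketch (the `input` and `output` environment variable names are assumptions, not guaranteed by the template), the generated `main.py` typically has this shape:

``` python
import os
from quixstreams import Application

# create an Application; in Quix Cloud the broker connection details
# are picked up automatically from the environment
app = Application()

# topics to consume from and publish to, taken from environment variables
input_topic = app.topic(os.environ["input"])
output_topic = app.topic(os.environ["output"])

# build a streaming dataframe and apply a placeholder transformation
sdf = app.dataframe(input_topic)
sdf = sdf.apply(lambda row: row)  # replace with your own transformation logic

# publish the transformed rows to the output topic
sdf = sdf.to_topic(output_topic)

if __name__ == "__main__":
    app.run(sdf)
```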


## Step 8: Sync your application

To sync your application, change into the `Starter transformation` directory and enter:

```
quix local deploy --push --sync
```

This updates your `quix.yaml` project file, and pushes all changes to your Git repository.

## Step 9: Create a project in Quix Cloud

In this step you create a project in Quix Cloud from your Git repository.

1. Return to Quix Cloud.
2. Select `Quix advanced configuration` to continue creation of your project.
3. Select your Git provider.
4. Link the project to your Git repository using the guide provided for your chosen Git provider.
5. Sync Quix Cloud to your project by clicking the `Sync environment` button.

## Step 10: See your pipeline running

Go to the pipeline view to see your pipeline running, with your Starter transformation.

![Pipeline running](../images/starter-transform.png)

## Next steps

* [Read the Quix CLI documentation](../kb/cli.md).
64 changes: 64 additions & 0 deletions docs/get-started/consume.md
@@ -0,0 +1,64 @@
# Consume data

You'll now write some code that simply consumes data from a topic and prints it out.

## Step 1: Write your consumer code

Write your consumer code:

``` python
from quixstreams import Application
import json

# connect to your local Kafka broker
app = Application(
broker_address="localhost:9092",
consumer_group="consume-v1",
auto_offset_reset="earliest",
)

# configure the input topic to subscribe to (you'll read data from this topic)
input_topic = app.topic("cpu-load")

# consume (read) messages from the input topic
sdf = app.dataframe(topic=input_topic)

# print every row
sdf = sdf.update(lambda row: print(json.dumps(row)))

if __name__ == "__main__":
# run the application and process all inbound messages using the sdf pipeline
app.run(sdf)
```

Save this in a file named `consumer.py`.

## Step 2: Run your consumer

```
python3 consumer.py
```

You are now subscribing to data on the `cpu-load` topic.

Each message received is printed out as JSON:

``` json
{
"cpu_load": 5.7,
"memory": {
"total": 0,
"used": 0,
"free": 0,
"percent": 0,
"sin": 90937131008,
"sout": 483672064
},
"timestamp": 1712238254512946000
}
```

## Next step

* [Process data](./process.md) - process streaming data in a Kafka topic in real time.