v 0.11 Doc Edits Part 2 (#2416)

Editing and updating screenshots.

jfermi authored Apr 20, 2023
1 parent edd9a5e commit cc5363d

Showing 39 changed files with 73 additions and 66 deletions.

16 changes: 8 additions & 8 deletions docs/docs/concepts/assertions.md
@@ -2,35 +2,35 @@

Test Specifications may be added to a trace to set a value for a step in the trace to determine success or failure. If test specs have already been added to a test, they will be on the Test screen:

![Test Spec List](../img/test-spec-list-0.6.png)
![Test Spec List](../img/test-spec-list-0.11.png)

After you have created a test and your test run is complete, click the **Add Test Spec** button at the bottom right of the Test screen.

![Add Test Spec](../img/add-test-spec-0.6.png)
![Add Test Spec](../img/add-test-spec-0.11.png)

The **Add Test Spec** dialog opens.

![Create Test Spec](../img/create-test-spec-0.6.png)
![Create Test Spec](../img/create-test-spec-0.11.png)

The span that the new test spec will apply to is highlighted in the graph view on the left:

![Selected Span](../img/selected-span-0.6.png)
![Selected Span](../img/selected-span-0.11.png)

To add an assertion to a span, click the first drop down to see the list of attributes that apply to the selected span:

![Assertion Attributes](../img/assertion-attributes-0.6.png)
![Assertion Attributes](../img/assertion-attributes-0.11.png)

Then select the operator for your assertion:

![Assertion Operators](../img/assertion-operators-0.6.png)
![Assertion Operators](../img/assertion-operators-0.11.png)

And add the value for comparison:

![Assertion Values](../img/assertion-values-0.6.png)
![Assertion Values](../img/assertion-values-0.11.png)
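
Putting these three selections together, a finished assertion pairs an attribute, an operator, and a value. As a rough sketch (the `http.status_code` attribute is an assumption about what the selected span exposes), the resulting check could read:

```css
attr:http.status_code = 200
```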

Finally, you can give your test spec an optional name and click **Save Test Spec**:

![Save Test Spec](../img/save-test-spec-0.6.png)
![Save Test Spec](../img/save-test-spec-0.11.png)


<!--- You can also create assertions by hovering over the **+** sign to the right of an attribute in the trace.
14 changes: 7 additions & 7 deletions docs/docs/concepts/expressions.md
@@ -24,7 +24,7 @@ Create variables in Tracetest based on the trace obtained by the test to enable

### **Arithmetic Operations**

Sometimes we need to manipulate data to ensure our test data is correct. As an example we will us a purchase operation. How you would make sure that after the purchase the product inventory is smaller than before? For this, we might want to use arithmetic operations:
Sometimes we need to manipulate data to ensure our test data is correct. As an example, we will use a purchase operation. How would you make sure that, after the purchase, the product inventory is smaller than before? For this, we might want to use arithmetic operations:

```css
attr:product.stock = attr:product.stok_before_purchase - attr:product.number_bought_items
```

@@ -35,7 +35,7 @@ attr:product.stock = attr:product.stok_before_purchase - attr:product.number_bou
Some tests might require strings to be compared, but maybe you need to generate a dynamic string that relies on a dynamic value. This might be used in an assertion or even in the request body referencing an environment variable.

```css
attr:error.message = "could not withdraw ${attr:withdraw.amount}, your balance is insufficient"
attr:error.message = "Could not withdraw ${attr:withdraw.amount}, your balance is insufficient."
```

Note that within `${}` you can add any expression, including arithmetic operations and filters.
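
As a hedged illustration (the attribute names below are assumptions, not attributes Tracetest defines for you), an interpolated string can embed an arithmetic operation directly inside `${}`:

```css
attr:confirmation.message = "You have ${attr:account.balance - attr:withdraw.amount} left in your account."
```
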
@@ -58,15 +58,15 @@ If multiple values are matched, the output will be a flat array containing all v

```css
'{ "array": [{"name": "Jorge", "age": 27}, {"name": "Tim", "age": 52}]}' | json_path '$.array[*]..["name", "age"]' = '["Jorge", 27, "Tim", 52]'
```

#### **Regex**
Filters part of the input that match a regex. Imagine you have a specific part of a text that you want to extract:
#### **RegEx**
Filters the part of the input that matches a RegEx. Imagine you have a specific part of a text that you want to extract:

```css
'My account balance is $48.52' | regex '\$\d+(\.\d+)?' = '$48.52'
```

#### **Regex Group**
If matching more than one value is required, you can define groups for your regex and extract multiple values at once.
#### **RegEx Group**
If matching more than one value is required, you can define groups for your RegEx and extract multiple values at once.

Wrap the groups you want to extract with parentheses.
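
As a sketch of how that might look (assuming the filter is exposed as `regex_group`, by analogy with the `regex` filter above), wrapping the dollars and cents in separate parentheses extracts both values from the same input:

```css
'My account balance is $48.52' | regex_group '\$(\d+)\.(\d+)'
```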

Expand All @@ -90,7 +90,7 @@ You can select the last item from a list by specifying `'last'` as the argument

### **Length**

Return the size of the input array. If it's a single value, it will return 1. Otherwise it will return `length(input_array)`.
Returns the size of the input array. If it's a single value, it will return 1. Otherwise it will return `length(input_array)`.

```css
'{ "array": [1, 2, 3] }' | json_path '$.array[*]' | length = 3
```
8 changes: 4 additions & 4 deletions docs/docs/concepts/transactions.md
@@ -1,12 +1,12 @@
# Transactions

Most End-to-End tests are not simple to run. They require some setup before the actual test is run. Actions like creating a new user, removing all items from a cart, etc. So, it's important that you can execute multiple steps as part of your test suite. Tracetest introduces the concept of **Transactions** to achieve this goal.
Most End-to-End tests are not simple to run. They require some setup before the actual test runs, such as creating a new user or removing all items from a cart. It is important that you can execute multiple steps as part of your test suite. Tracetest introduces the concept of **Transactions** to achieve this goal.

## What is a transaction?
## What is a Transaction?
A transaction is defined as a group of steps that are executed in the defined order and can access information exported by previous step executions. Each step is a test.

## Chaining Tests
The main benefit of using transactions is to be able to chain tests together and use values obtained from a test in a subsequent test.
The main benefit of using transactions is to chain tests together and use values obtained from a test in a subsequent test.

### How Values are Shared by Tests
When a transaction is run, a context object is created with information about that specific run. One of those pieces of information is an `environment variables` object, which is empty by default. If the transaction is run when referencing an [environment](./environments), all values from the selected environments will be copied to the `environment variables` object.
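
As a hedged sketch (the `USER_ID` key and the use of the `env:` prefix here are assumptions for illustration), a later step could then reference a value that an earlier test placed into that object:

```css
attr:user.id = env:USER_ID
```
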
@@ -33,7 +33,7 @@ This would create an output called `TIME_CANCEL_SUBSCRIPTION_MESSAGE_OBTAINED` t

### Transactions Execution Flow

Transaction steps are executed sequentially. A next step is only executed after the previous step finishes executing successfully. A successful step is one which managed to trigger an operation and received a trace back from the data store. Failing assertions do not make a transaction stop executing the next steps.
Transaction steps are executed sequentially. A next step is only executed after the previous step finishes executing successfully. A successful step is one which managed to trigger an operation and received a trace back from the data store. Failing assertions do not stop a transaction from executing the next steps.

Examples:

4 changes: 2 additions & 2 deletions docs/docs/concepts/versioning.md
@@ -1,11 +1,11 @@
# Versioning

As your system evolves, your tests tend to do the same. However, that might be confusing if you don't have a versioning mechanism in place. Imagine that you wrote a new test for the version `v0.5.0` of your application. After some months, your application is in version `v0.13.7`. Most likely, your tests changed as you moved your application forward. But without versioning, if you revisit that first test you created, it will look like exactly the one you use today instead of the test you originally wrote. That happens because while you have multiple versions of your application, you only keep track of one version of your tests: the current version. So there is no way of going back in time and seeing what a test looked like in the past.
As your system evolves, your tests tend to do the same. However, that might be confusing if you don't have a versioning mechanism in place. Imagine that you wrote a new test for the version `v0.5.0` of your application. After some months, your application is in version `v0.13.7`. Most likely, your tests changed as you moved your application forward. But without versioning, if you revisit that first test you created, it will look exactly like the one you use today instead of the test you originally wrote. That happens because while you have multiple versions of your application, you only keep track of one version of your tests: the current version. So there is no way to go back in time and see what a test looked like in the past.

**But that is not a problem if you use Tracetest. It has versioning built-in!**

## **How It Works**
Once you create a test, it is tagged as the initial version (`v1`). Every time you change something in your test (edit its identification details, add assertions, change selectors, etc) Tracetest detects those changes and increase the version by 1. If no changes were made, the version is kept untouched.
Once you create a test, it is tagged as the initial version (`v1`). Every time you change something in your test (edit its identification details, add assertions, change selectors, etc), Tracetest detects those changes and increases the version by 1. If no changes were made, the version is kept untouched.

### **Change Detection**
These are the fields of a test that are checked to verify if it has changed:
Binary file added docs/docs/img/add-test-spec-0.11.png
Binary file added docs/docs/img/all-tests-list-0.11.png
Binary file added docs/docs/img/assertion-attributes-0.11.png
Binary file added docs/docs/img/assertion-operators-0.11.png
Binary file added docs/docs/img/assertion-values-0.11.png
Binary file added docs/docs/img/awaiting-trace-0.11.png
Binary file added docs/docs/img/choose-example-0.11.png
Binary file added docs/docs/img/choose-example-pokemon-0.11.png
Binary file added docs/docs/img/choose-trigger-0.11.png
Binary file added docs/docs/img/create-button-0.11.png
Binary file added docs/docs/img/create-test-0.11.png
Binary file added docs/docs/img/create-test-spec-0.11.png
Binary file added docs/docs/img/export-trace-options-0.11.png
Binary file added docs/docs/img/exports-junit-0.11.png
Binary file added docs/docs/img/exports-test-definition-0.11.png
Binary file added docs/docs/img/finished-trace-0.11.png
Binary file added docs/docs/img/main-screen-0.11.png
Binary file added docs/docs/img/provide-addl-information-0.11.png
Binary file added docs/docs/img/run-test-and-option-0.11.png
Binary file added docs/docs/img/save-test-spec-0.11.png
Binary file added docs/docs/img/select-test-0.11.png
Binary file added docs/docs/img/selected-span-0.11.png
Binary file added docs/docs/img/test-spec-list-0.11.png
Binary file added docs/docs/img/test-tab-0.11.png
Binary file added docs/docs/img/tests-actions-0.11.png
Binary file added docs/docs/img/timeline-view-0.11.png
Binary file added docs/docs/img/trace-tab-0.11.png
Binary file added docs/docs/img/trace-tab-icons-0.11.png
20 changes: 10 additions & 10 deletions docs/docs/web-ui/creating-test-specifications.md
@@ -2,35 +2,35 @@

Test Specifications may be added to a trace to set a value for a step in the trace to determine success or failure. If test specs have already been added to a test, they will be on the Test screen:

![Test Spec List](../img/test-spec-list-0.6.png)
![Test Spec List](../img/test-spec-list-0.11.png)

After you have created a test and your test run is complete, click the **Add Test Spec** button at the bottom right of the Test screen.

![Add Test Spec](../img/add-test-spec-0.6.png)
![Add Test Spec](../img/add-test-spec-0.11.png)

The **Add Test Spec** dialog opens.
The **Add Test Spec** dialog opens. You can choose an example test spec from the drop down.

![Create Test Spec](../img/create-test-spec-0.6.png)
![Create Test Spec](../img/create-test-spec-0.11.png)

The span that the new test spec will apply to is highlighted in the graph view on the left:

![Selected Span](../img/selected-span-0.6.png)
![Selected Span](../img/selected-span-0.11.png)

To add an assertion to a span, click the first drop down to see the list of attributes that apply to the selected span:
To add an assertion to a span, click in the Attribute field to see the list of attributes that apply to the selected span:

![Assertion Attributes](../img/assertion-attributes-0.6.png)
![Assertion Attributes](../img/assertion-attributes-0.11.png)

Then select the operator for your assertion:

![Assertion Operators](../img/assertion-operators-0.6.png)
![Assertion Operators](../img/assertion-operators-0.11.png)

And add the value for comparison:

![Assertion Values](../img/assertion-values-0.6.png)
![Assertion Values](../img/assertion-values-0.11.png)
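
For example, a completed test spec combines those three fields into a single check. A minimal sketch, assuming you want to bound the selected span's duration via the `tracetest.span.duration` meta-attribute:

```css
attr:tracetest.span.duration < 500ms
```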

Finally, you can give your test spec an optional name and click **Save Test Spec**:

![Save Test Spec](../img/save-test-spec-0.6.png)
![Save Test Spec](../img/save-test-spec-0.11.png)


<!--- You can also create assertions by hovering over the **+** sign to the right of an attribute in the trace.
34 changes: 19 additions & 15 deletions docs/docs/web-ui/creating-tests.md
@@ -1,45 +1,49 @@
# Creating Tests

![Main Screen](../img/main-screen-0.6.png)
![Main Screen](../img/main-screen-0.11.png)

Click the **Create Test** button and the **Create New Test** dialog appears:
Click the **Create** button and select **Create New Test** in the drop down:

![Create a Test Button](../img/create-test-button-0.6.png)
![Create a Test Button](../img/create-button-0.11.png)

![Create a Test](../img/create-test-0.6.png)
The "Create New Test" dialog appears:

![Create a Test](../img/create-test-0.11.png)

The option to choose the kind of trigger to initiate the trace is presented:

- HTTP Request - Create a basic HTTP request.
- RPC Request - Test and debug your RPC request.
- GRPC Request - Test and debug your GRPC request.
- cURL Command - Define your HTTP test via a cURL command.
- Postman Collection - Define your HTTP request via a Postman collection.
- TraceID - Define your test via a TraceID.

Choose the trigger and click **Next**:

![Choose Trigger](../img/choose-trigger-0.6.png)
![Choose Trigger](../img/choose-trigger-0.11.png)

In this example, HTTP Request has been chosen.

![Choose Example](../img/choose-example-0.6.png)
![Choose Example](../img/choose-example-0.11.png)

Input the **Name** of the test and the **Description**, or select one of the examples provided in the drop down:

![Choose Example Pokemon](../img/choose-example-pokemon-0.6.png)
![Choose Example Pokemon](../img/choose-example-pokemon-0.11.png)

The **Pokemon - List** example has been chosen. Then click **Next**.
The **Pokemon - Import** example has been chosen. Then click **Next**.

![Choose Example Pokemon](../img/choose-example-pokemon-list-0.6.png)
![Choose Example Pokemon](../img/choose-example-pokemon-import-0.11.png)

Add any additional information and click **Create**:
Add any additional information and click **Create & Run**:

![Create Test](../img/provide-addl-information-0.6.png)
![Create Test](../img/provide-addl-information-0.11.png)

The test will start:

![Awaiting Trace](../img/awaiting-trace-0.6.png)
![Awaiting Trace](../img/awaiting-trace-0.11.png)

When the test is finished, you will get the following results:

![Finished Trace](../img/finished-trace-0.6.png)
![Finished Trace](../img/finished-trace-0.11.png)

Please visit the [Test Results](test-results.md) document for an explanation of viewing the results of a test..
Please visit the [Test Results](test-results.md) document for an explanation of viewing the results of a test.
16 changes: 8 additions & 8 deletions docs/docs/web-ui/exporting-tests.md
@@ -1,24 +1,24 @@
# Exporting Tests

Tracetest allows you to export the different set of information displayed for assertions and checks for a way you can use it as input for other tools and create text based tests to use on your CI/CD pipelines using the CLI and more options.
Tracetest allows you to export the information displayed for assertions and checks so you can use it as input for other tools, create text-based tests to use in your CI/CD pipelines with the CLI, and more.

The currently supported exports are:
1. JUnit results XML.
2. Test Definition YAML.

To access any of the available exports, go to the run/trace page details for any test and, at the header level, you'll find a three dot menu which will display the options.
To access any of the available exports, go to the run/trace page details for any test and, at the top right next to "Run Test", you'll find a settings icon which will display the options.

![Export Trace Options](../img/exports-trace-options.png)
![Export Trace Options](../img/export-trace-options-0.11.png)

## JUnit Results XML
To access the JUnit XML file, select the JUnit option from the dropdown and you'll find the file viewer modal with the location to download the file.
To access the JUnit XML file, select the "JUnit Results" option from the dropdown and you'll find the file viewer modal with the location to download the file.
The JUnit report contains the results from each of the assertions added to the test and their statuses. Depending on how many assertions the test has, this file will grow.

![Export Trace JUnit](../img/exports-junit.png)
![Export Trace JUnit](../img/exports-junit-0.11.png)

## Test Definition YAML
The Tracetest CLI allows you to execute text based tests. This means you can store all of your tests in a repo, keep track of the different versions and use them for your CI/CD process.
An easy way to start is to export the test definition directly from the UI by selecting the option from the dropdown.
The Tracetest CLI allows you to execute text-based tests. This means you can store all of your tests in a repo, keep track of the different versions and use them for your CI/CD process.
An easy way to start is to export the test definition directly from the UI by selecting the "Test Definition" option from the dropdown.
The file viewer modal will pop up and you can copy and paste the contents or download the file.

![Export Trace Test Definition](../img/exports-test-definition.png)
![Export Trace Test Definition](../img/exports-test-definition-0.11.png)
23 changes: 13 additions & 10 deletions docs/docs/web-ui/test-results.md
@@ -2,44 +2,47 @@

From the **All Tests** screen, you can access all your existing tests, create new tests and see the results of any test that has been run.

![All Tests List](../img/all-tests-list-0.6.png)
![All Tests List](../img/all-tests-list-0.11.png)

Click on the settings icon to the right of each test. You can delete the test from here:

![Tests Actions](../img/tests-actions-0.6.png)
![Tests Actions](../img/tests-actions-0.11.png)

Click on the arrow next to the test name and the list of test runs will appear:

![Select Test](../img/select-test-0.6.png)
![Select Test](../img/select-test-0.11.png)

Click on a test run and the Trigger Details screen will open. From here, you can change and save the details of the test. On the top right, there is a button to run the test and a settings icon with the following options:

- JUnit Results - The test results in JUnit format.
- Test Definition - The test definition YAML file.
- Edit - Edit the test.
- Delete - Delete the test.

![Run Tests & Options](../img/run-test-and-option-0.6.png)
![Run Tests & Options](../img/run-test-and-option-0.11.png)

Click on the **Trace** tab to open the Trace Details screen:

![Trace Tab View](../img/trace-tab-0.6.png)
![Trace Tab View](../img/trace-tab-0.11.png)

Use the icons at the top right to manipulate the graph. The options are:
Use the icons at the bottom left to manipulate the test image. The options are:

- Graph View
- Timeline View
- Zoom In
- Zoom Out
- Fit View
- Mini Map

![Trace Tab Icons](../img/trace-tab-icons-0.6.png)
![Trace Tab Icons](../img/trace-tab-icons-0.11.png)

Use the toggle button highlighted below to switch to the **Timeline View**:
The following shows the test in the **Timeline View**:

![Timeline View](../img/timeline-view-0.6.png)
![Timeline View](../img/timeline-view-0.11.png)

Click on the **Test** tab to see the details of Test Specs and Assertions for the test:

![Test Tab](../img/test-tab-0.6.png)
![Test Tab](../img/test-tab-0.11.png)

<!-- The test results include:
4 changes: 2 additions & 2 deletions docs/docs/web-ui/undefined-variables.md
@@ -12,7 +12,7 @@ Undefined variables are dependent on the environment selected and whether or not

### **Supply Variable Value at Runtime**

A user wants a test or transaction they can run on a particular user, order id, etc that is configurable at run time. This would make running an adhoc test in an environment, even production, very easy and convenient. In this case, the user would reference the variable, but not add it to the environment. Each time they run the test or transaction, they would be prompted for the unspecified variables.
A user wants a test or transaction they can run on a particular user, order id, etc. that is configurable at run time. This makes running an adhoc test in an environment, even production, very easy and convenient. In this case, the user references the variable, but doesn't add it to the environment. Each time they run the test or transaction, they will be prompted for the unspecified variables.
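
A minimal sketch of the idea (the `ORDER_ID` variable, the `order.id` attribute, and the `env:` prefix are assumptions for illustration): reference the variable in an assertion without defining it in the environment, and you will be prompted for its value when the run starts.

```css
attr:order.id = env:ORDER_ID
```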

### **Supply Variable Value from a Previous Test**

@@ -30,7 +30,7 @@ In Tracetest, undefined variables can be used in both the UI and CLI.

![Create Test Spec Assertions](../img/create-test-spec-assertions.png)

3. Create a GRPC pokemon add test that uses environment variables for hostname and pokemon name:
3. Create a GRPC Pokemon add test that uses environment variables for hostname and Pokemon name:

![Create GRPC](../img/create-grpc.png)

