Support for BigQuery adapter #172
Changes from 15 commits
```diff
@@ -162,4 +162,3 @@ jobs:
          title: "SQLFluff Lint"
          input: "./annotations.json"
```
```diff
@@ -9,3 +9,5 @@ logs/
 Pipfile
 Pipfile.lock
 env.sh
+.python_version
```
```diff
@@ -1,5 +1,7 @@
 {% macro insert_into_metadata_table(database_name, schema_name, table_name, content) -%}
-    {{ return(adapter.dispatch('insert_into_metadata_table', 'dbt_artifacts')(database_name, schema_name, table_name, content)) }}
+    {% if content != "" %}
+        {{ return(adapter.dispatch('insert_into_metadata_table', 'dbt_artifacts')(database_name, schema_name, table_name, content)) }}
+    {% endif %}
 {%- endmacro %}

 {% macro spark__insert_into_metadata_table(database_name, schema_name, table_name, content) -%}
```
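The point of the `{% if content != "" %}` guard is that rendering the INSERT statement with an empty `content` would produce a malformed `insert into ... VALUES` with no rows. The real macro is Jinja, but the control flow can be sketched in Python (function and argument values here are hypothetical stand-ins, not part of the package):

```python
# Hypothetical Python stand-in for the guarded macro: dbt's real macro is
# Jinja, but the control flow added in this hunk is the same.
def render_insert(database_name, schema_name, table_name, content):
    """Build the INSERT statement, or return None when there is nothing to insert."""
    if content == "":  # mirrors the new {% if content != "" %} guard
        return None
    return (
        f"insert into {database_name}.{schema_name}.{table_name}\n"
        "VALUES\n"
        f"{content}"
    )

# With rows, a full statement is produced:
print(render_insert("proj", "meta", "models", "('a', 1)"))
# With no rows, nothing is rendered, so no malformed "insert ... VALUES" is run:
print(render_insert("proj", "meta", "models", ""))
```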
```diff
@@ -19,3 +21,16 @@
     {% do run_query(insert_into_table_query) %}
 {%- endmacro %}

+{% macro bigquery__insert_into_metadata_table(database_name, schema_name, table_name, content) -%}
+
+    {% set insert_into_table_query %}
+    insert into {{database_name}}.{{ schema_name }}.{{ table_name }}
+    VALUES
+    {{ content }}
+    {% endset %}
+
+    {% do run_query(insert_into_table_query) %}
+
+{%- endmacro %}
```

Review thread on the `VALUES` line:

**@adrpino:** Hi @charles-astrafy, I was just playing around with your branch, and I got this query from the statement that stores the tests (I've generalized the info so you can play with it in your BQ instance). However, this query produces an error. Is there something I'm doing wrong?

**@charles-astrafy:** @adrpino: I think you are doing something wrong, because the DML that inserts values into the raw table for BigQuery does not use that syntax. So it seems you are not running the code with the BigQuery adapter; it is taking the code from the default adapter.

**@adrpino:** Hi @charles-astrafy, thanks! Is there any way in which I have to enforce the BQ adapter? Isn't this done by default?

**@charles-astrafy:** Hi @adrpino, the adapter that dbt will use is defined via your profiles.yml file. You define it in the "type" parameter of the target definition within that profiles.yml file (see screenshot in attachment). Let me know if you still have issues.
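The review thread above points at profiles.yml as the place where the adapter is chosen. A minimal illustrative target sketch follows; the profile, project, and dataset names are placeholders and not part of this PR:

```yaml
# Illustrative profiles.yml target -- all names are placeholders.
dbt_artifacts_profile:
  target: dev
  outputs:
    dev:
      type: bigquery          # routes adapter.dispatch to bigquery__ macros
      method: oauth
      project: my-gcp-project # placeholder GCP project id
      dataset: dbt_meta       # placeholder dataset
      threads: 4
```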
**Reviewer:** Going to remove this for now, but feel free to add it in a separate PR @charles-astrafy.

**Reply:** Sure, will put it in a separate PR.
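The adapter-selection behavior the thread above hinges on ("it is taking the code from the default adapter") can be sketched as follows. This is a simplified model of how `adapter.dispatch` resolves a macro name, not dbt's actual implementation; the macro table and return strings are hypothetical:

```python
# Sketch (not dbt's actual implementation) of how adapter.dispatch picks a
# macro: try "<adapter>__<name>" first, then fall back to "default__<name>".
MACROS = {
    "spark__insert_into_metadata_table": lambda: "spark insert",
    "bigquery__insert_into_metadata_table": lambda: "bigquery insert",
    "default__insert_into_metadata_table": lambda: "default insert",
}

def dispatch(macro_name, adapter_type):
    """Return the most specific macro implementation for the active adapter."""
    for candidate in (f"{adapter_type}__{macro_name}", f"default__{macro_name}"):
        if candidate in MACROS:
            return MACROS[candidate]
    raise KeyError(f"no implementation of {macro_name}")

# A bigquery target resolves to the macro added in this PR:
print(dispatch("insert_into_metadata_table", "bigquery")())
# An adapter with no specific implementation falls back to the default,
# which is what the reviewer was hitting above:
print(dispatch("insert_into_metadata_table", "postgres")())
```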