-payload := strings.NewReader("{\n\"query_id\": 1252207,\n\"query_sql\": \"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
+payload := strings.NewReader("{\n\"query_id\": 1252207,\n\"query_sql\": \"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
 
 req, _ := http.NewRequest("PATCH", url, payload)
@@ -104,7 +104,7 @@ curl_setopt_array($curl, [
   CURLOPT_TIMEOUT => 30,
   CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
   CURLOPT_CUSTOMREQUEST => "PATCH",
-  CURLOPT_POSTFIELDS => "{\n \"query_id\": 1252207,\n \"query_sql\": \"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}",
+  CURLOPT_POSTFIELDS => "{\n \"query_id\": 1252207,\n \"query_sql\": \"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}",
.body("{\n\"query_id\": 1252207,\n\"query_sql\": \"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
130
+
.body("{\n\"query_id\": 1252207,\n\"query_sql\": \"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
api-reference/quickstart/queries-eg.mdx (+1, -1)
@@ -6,7 +6,7 @@ In this quickstart, we will walk through how to turn any dashboard (or set of qu
 
 ### Prerequisites
 - Python environment set up (check out [Anaconda Navigator](https://docs.continuum.io/free/navigator/) if you want somewhere to start.)
-- Have a Dune API key from the team/user who's queries you want to manage (to obtain one [follow the steps here](../overview/authentication#generate-an-api-key))
+- Have a Dune API key from the team/user whose queries you want to manage (to obtain one [follow the steps here](../overview/authentication#generate-an-api-key))
api-reference/quickstart/tables-eg.mdx (+2, -2)
@@ -16,7 +16,7 @@ DUNE_API_KEY=<paste your API key here>
 </Tip>
 
 ### Upload the CSV
-Follow below steps to upload your CSV. Please make sure to modify paths to your .env file and to your CSV file.
+Follow the below steps to upload your CSV. Please make sure to modify paths to your .env file and to your CSV file.
 
 ```python
 import dotenv, os
@@ -45,7 +45,7 @@ with open(csv_file_path) as open_file:
 
 Once the upload is successful, you will see the data show up under [Your Data](https://dune.com/queries?category=uploaded_data) in the Data Explorer.
 
-You can query your uploaded table under the name `dune.<team or user handle>.dataset_<table name defined>`. For example, here I defined the table name to be "cereal_table" and my team name is "dune", so to access the uploaded table we will do `select * from dune.dune.dataset_cereal_table`
+You can query your uploaded table under the name `dune.<team or user handle>.dataset_<table name defined>`. For example, here I defined the table name to be "cereal_table" and my team name is "dune", so to access the uploaded table we will do `select * from dune.dune.dataset_cereal_table`.
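Putting the steps from this quickstart together, here is a sketch of the whole upload flow. It assumes the dune-client Python SDK; the method and parameter names may differ between SDK versions, so treat this as orientation rather than a definitive implementation.

```python
# Sketch of the full CSV upload flow, assuming the dune-client Python SDK.
# Method and parameter names are assumptions and may vary by SDK version.
import dotenv, os
from dune_client.client import DuneClient

dotenv.load_dotenv(".env")  # adjust the path to your .env file
dune = DuneClient(os.environ["DUNE_API_KEY"])

csv_file_path = "cereal.csv"  # adjust the path to your CSV file
with open(csv_file_path) as open_file:
    data = open_file.read()

# With team handle "dune" and table name "cereal_table", the result is
# queryable as: select * from dune.dune.dataset_cereal_table
success = dune.upload_csv(data=data, table_name="cereal_table")
print(success)
```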
api-reference/tables/endpoint/insert.mdx (+5, -5)
@@ -7,7 +7,7 @@ To be able to insert into a table, it must have been created with the [/create e
 
 <Note>
 - The data in the files must conform to the schema, and must use the same column names as the schema.
-- One successful `/insert` request consumes 1 credits.
+- One successful `/insert` request consumes 1 credit.
 - The maximum request size is 1.2GB
 </Note>
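For orientation, a minimal Python sketch of an `/insert` call with a CSV body follows. The endpoint path, the namespace/table placeholders, and the header name are assumptions, not taken from this diff.

```python
# Minimal sketch of an /insert call with a CSV body. The endpoint path,
# the <namespace>/<table_name> placeholders, and the header name are
# assumptions -- check the endpoint reference before use.
import os
import requests

url = "https://api.dune.com/api/v1/table/<namespace>/<table_name>/insert"  # assumed
headers = {
    "X-DUNE-API-KEY": os.environ["DUNE_API_KEY"],
    "Content-Type": "text/csv",  # CSV column names must match the table schema
}

with open("rows.csv", "rb") as f:
    resp = requests.post(url, data=f, headers=headers)  # one success = 1 credit
resp.raise_for_status()
```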
@@ -19,7 +19,7 @@ A status code of 200 means that the data in the request was successfully inserte
 If you get any other status code, you can safely retry your request after addressing the issue that the error message indicated.
 
 ## Concurrent requests
-A limited number of concurrent insertion requests per table is supported. However, there will be a slight performance penalty as we serialize the writes behind the scenes to ensure data integrity. Larger number of concurrent requests per table may result in an increased number of failures. Therefore, we recommend managing your requests within a 5-10 threshold to maintain optimal performance.
+A limited number of concurrent insertion requests per table is supported. However, there will be a slight performance penalty as we serialize the writes behind the scenes to ensure data integrity. A larger number of concurrent requests per table may result in an increased number of failures. Therefore, we recommend managing your requests within a 5-10 threshold to maintain optimal performance.
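One way to honor that 5-10 threshold from Python is a bounded thread pool, sketched below. `insert_one()` is a hypothetical helper standing in for the HTTP call shown earlier; only the throttling pattern is the point here.

```python
# Sketch: cap concurrent /insert requests per table at 5, per the
# recommendation above. insert_one() is a hypothetical helper that
# wraps the HTTP call shown earlier and returns the status code.
from concurrent.futures import ThreadPoolExecutor

def insert_one(csv_text: str) -> int:
    ...  # POST csv_text to the /insert endpoint, return resp.status_code

chunks = ["a,b\n1,2\n", "a,b\n3,4\n"]  # pre-split CSV payloads

with ThreadPoolExecutor(max_workers=5) as pool:  # stay within the 5-10 range
    statuses = list(pool.map(insert_one, chunks))
```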
 
 ## Supported filetypes
 ### CSV files (`Content-Type: text/csv`)
@@ -32,15 +32,15 @@ Each line must have keys that match the column names of the table.
 ## Data types
 DuneSQL supports a variety of types which are not natively supported in many data exchange formats. Here we provide guidance on how to work with such types.
 ### Varbinary values
-When uploading varbinary data using JSON or CSV formats, you need to convert the binary data into a textual representation. Reason being, JSON or CSV don't natively support binary values. There are many ways to transform binary data to a textual representation. We support **hexadecimal** and **base64** encodings.
+When uploading varbinary data using JSON or CSV formats, you need to convert the binary data into a textual representation. The reason being, JSON or CSV don't natively support binary values. There are many ways to transform binary data to a textual representation. We support **hexadecimal** and **base64** encodings.
 
 #### base64
 Base64 is a binary-to-text encoding scheme that transforms binary data into a sequence of characters. All characters are taken from a set of 64 characters.
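To make the two supported encodings concrete, here is a small Python sketch using only the standard library. Whether the hexadecimal form expects a `0x` prefix is an assumption here, not something this diff states.

```python
# Encoding a varbinary value as text for a JSON/CSV upload, using the two
# supported encodings. The "0x" prefix on the hex form is an assumption.
import base64

raw = bytes.fromhex("deadbeef")  # example binary value

hex_repr = "0x" + raw.hex()                # hexadecimal: "0xdeadbeef"
b64_repr = base64.b64encode(raw).decode()  # base64: "3q2+7w=="

print(hex_repr, b64_repr)
```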
api-reference/tables/endpoint/upload.mdx (+1, -1)
@@ -16,7 +16,7 @@ For working with uploads, keep in mind that:
 - File has to be < 200 MB
 - Column names in the table can't start with a special character or digits.
 - Private uploads require a Premium subscription.
-- If you upload to an existing table name, it will delete the old data and overwite it with your new data. Appends are only supported for the `/create`, `/insert` endpoints.
+- If you upload to an existing table name, it will delete the old data and overwrite it with your new data. Appends are only supported for the `/create`, `/insert` endpoints.
 - To delete an upload table, you must go to `user settings (dune.com) -> data -> delete`.
 
 If you have larger datasets you want to upload, please [contact us here](https://docs.google.com/forms/d/e/1FAIpQLSekx61WzIh-MII18zRj1G98aJeLM7U0VEBqaa6pVk_DQ7lq6Q/viewform)
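For completeness, a Python sketch of the upload call itself follows. The endpoint path and the JSON field names are assumptions, not taken from this diff; verify them against the endpoint reference.

```python
# Sketch of the CSV upload endpoint, honoring the constraints listed above
# (file < 200 MB; re-using an existing table_name overwrites its data).
# The endpoint path and JSON field names are assumptions.
import os
import requests

with open("cereal.csv") as f:
    csv_text = f.read()

resp = requests.post(
    "https://api.dune.com/api/v1/table/upload/csv",  # assumed endpoint
    headers={"X-DUNE-API-KEY": os.environ["DUNE_API_KEY"]},
    json={"table_name": "cereal_table", "data": csv_text},
)
resp.raise_for_status()
```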