Update 03-batch-queries.md

Author: Iñigo Medina (aka MacGyver), 2018-06-25 15:04:30 +02:00, committed by GitHub
parent b880323080
commit 2c507d1d0c


@@ -4,7 +4,7 @@ A Batch Query enables you to request queries with long-running CPU processing times.
_Batch Queries are not intended to be used for large query payloads that contain over 16,384 characters (16 KB). For instance, if you are inserting a large number of rows into your table, you still need to use the [Import API](https://carto.com/docs/carto-engine/import-api/) or [SQL API]({{ site.sqlapi_docs }}/guides/) for this type of data management. Batch Queries are specific to queries and CPU usage._
**Note:** In order to use Batch Queries, you **must** be [authenticated]({{ site.sqlapi_docs }}/guides/authentication/) using API keys.
**Note:** The SQL API does not expose an endpoint to list Batch Queries (jobs). Therefore, when creating a Batch Query (job), you must always save the ID from the response; it is the main reference for any later operation.
### Authentication
@@ -149,6 +149,9 @@ request(options, function (error, response, body) {
});
```
**Note:** You must always save the `job_id` from the response for any later operation, such as reading or deleting a job. As mentioned above, the SQL API does not expose an endpoint to list jobs.
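For illustration, here is a minimal sketch of saving the `job_id` at creation time, using the same Node.js `request` library as the example above. The `{username}`, `{api_key}`, and query values are placeholders, and the exact response fields are an assumption based on the creation call returning the job as JSON:

```
var request = require("request");

// Placeholders: replace {username} and {api_key} with your own values.
var options = {
  method: "POST",
  url: "https://{username}.carto.com/api/v2/sql/job?api_key={api_key}",
  headers: { "content-type": "application/json" },
  // The long-running SQL you want to run as a Batch Query.
  body: { query: "UPDATE airports SET type = 'international'" },
  json: true
};

request(options, function (error, response, body) {
  if (error) throw new Error(error);
  // Save the job_id: it is the only reference to this job for later reads or deletes.
  var jobId = body.job_id;
  console.log("Created Batch Query job:", jobId);
});
```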
#### Read a Job
To read a Batch Query job, make a GET request with the following parameters.
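As a rough sketch of that GET request in Node.js (again with the `request` library), assuming the job is read from `/api/v2/sql/job/{job_id}` and that `{job_id}` is the ID saved when the job was created:

```
var request = require("request");

// {username}, {api_key} and {job_id} are placeholders; job_id comes from the creation response.
var options = {
  method: "GET",
  url: "https://{username}.carto.com/api/v2/sql/job/{job_id}?api_key={api_key}",
  json: true
};

request(options, function (error, response, body) {
  if (error) throw new Error(error);
  // The response includes the job status (for example "pending", "running", "done" or "failed").
  console.log(body.status);
});
```

Since there is no endpoint to list jobs, reading a job by its saved `job_id` is how you check whether it has finished.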