updated documentation

brianc 2010-10-28 19:46:50 -05:00
parent da8026df9d
commit 827e7d7c1d

README.md (303 changed lines)

# node-postgres

Non-blocking (async) JavaScript PostgreSQL client for node.js, written in a fully TDD fashion.

## alpha version

### Whirlwind tour
    var Client = require('node-postgres').Client;
    var client = new Client({
      user: 'brianc',
      database: 'test',
      password: 'boom' //plaintext or md5 supported
    });

    client.connect();

    var printRow = function(row) {
      console.log(row.fields);
    };

    var simpleQuery = client.query("select * from user where heart = 'big'");
    simpleQuery.on('row', printRow);

    var preparedStatement = client.query({
      name: 'user by heart type',
      text: 'select * from user where heart = $1',
      values: ['big']
    });
    preparedStatement.on('row', printRow);

    var cachedPreparedStatement = client.query({
      name: 'user by heart type',
      values: ['filled with kisses']
    });
    cachedPreparedStatement.on('row', printRow);

    //wrap client.end() in a function so it runs when 'end' fires,
    //instead of being invoked immediately
    cachedPreparedStatement.on('end', function() {
      client.end();
    });
### Philosophy

* well tested
* no monkey patching
* no dependencies (well...besides PostgreSQL)

### Installation

Clone the repo:
    git clone git://github.com/brianc/node-postgres
    cd node-postgres
And just like magic, you're ready to contribute! <3
I don't have _style guidelines_ or anything right now. I'm 100x more
concerned with test coverage, functionality, and happy coding than I
am about whether or not you've got the proper spacing after your `{ hash: 'separators' }`.

## More info please

Srsly check out the [[wiki]]. MUCH more information there.

p.s. want your own offline version of the wiki?

    git clone git://github.com/brianc/node-postgres.wiki.git

__github is magic__

### Connection

The connection object is a 1 to 1 mapping to the [postgres
client/server messaging protocol](http://developer.postgresql.org/pgdocs/postgres/protocol.html).
The __Connection__ object is mostly used by the Client object (which...I haven't yet
finished implementing) but you can do anything you want with PostgreSQL using
the connection object if you're really into that. I studied the
protocol for a while while implementing this, and the documentation is pretty
solid. If you're already familiar you should be right at home. Have
fun looking up the [oids for the datatypes in your bound queries](http://github.com/brianc/node-postgres/blob/master/script/list-db-types.js).

There are a few minor variations from the protocol:

- The connection only supports 'text' mode right now.
- Renamed 'passwordMessage' to 'password'
- Renamed 'startupMessage' to 'startup'
- Renamed 'errorResponse' to 'error'
- Renamed 'noticeResponse' to 'notice'

The reason for the renamings is that 90% of the message names in the
protocol do not contain "message", "request", "response", or anything
similar, and I feel it's a bit redundant to send a "passwordMessage
message." But then again...[I do say ATM machine](http://en.wikipedia.org/wiki/RAS_syndrome).
Anyways...using a connection directly is a pretty verbose and
cumbersome affair. Here's an example of executing a prepared query
using the __Connection__ api directly, in compliance with
PostgreSQL.

_note: this works and is taken directly from an integration test;
however, it doesn't include error handling_
    var con = new Connection({stream: new net.Stream()});
    con.connect('5432', 'localhost');
    con.once('connect', function() {
      con.startup({
        user: username,
        database: database
      });
      con.once('readyForQuery', function() {
        con.query('create temp table ids(id integer)');
        con.once('readyForQuery', function() {
          con.query('insert into ids(id) values(1); insert into ids(id) values(2);');
          con.once('readyForQuery', function() {
            con.parse({
              text: 'select * from ids'
            });
            con.flush();
            con.once('parseComplete', function() {
              con.bind();
              con.flush();
            });
            con.once('bindComplete', function() {
              con.execute();
              con.flush();
            });
            con.once('commandComplete', function() {
              con.sync();
            });
            con.once('readyForQuery', function() {
              con.end();
            });
          });
        });
      });
    });
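Since the example above deliberately omits error handling, here is one hedged way to bolt some on, using the renamed 'error' event from the list above. The cleanup policy is my own sketch, not a pattern prescribed by the library:

    //sketch: if the server reports an error, log it and close the
    //connection so the process doesn't hang on a half-finished session
    con.once('error', function(err) {
      console.log('query failed: ' + JSON.stringify(err));
      con.end();
    });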
### Client
Basically a facade on top of the connection to provide a _much_ more
user friendly, "node style" interface for doing all the lovely things
you like with PostgreSQL.
Now that I've got the __Connection__ api in place, the bulk and meat of
the work is being done on the __Client__ to provide the best possible
API. Help? Yes please!
    var client = new Client({
      user: 'brian',
      database: 'postgres'
    });

    client.query("create temp table ids(id integer)");
    client.query("insert into ids(id) values(1)");
    client.query("insert into ids(id) values(2)");

    var query = client.query("select * from ids", function(row) {
      row.fields[0]; // <- that equals 1 the first time. 2 the second time.
    });

    query.on('end', function() {
      client.end();
    });
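If you'd rather have the whole result set at once than handle rows as they stream, here's a small sketch using only the 'row' and 'end' events shown in this README; the accumulating array is my own addition, not a library feature:

    var rows = [];
    var query = client.query("select * from ids");

    //collect each row as it arrives
    query.on('row', function(row) {
      rows.push(row);
    });

    //'end' fires after the last row, so the full set is in memory here
    query.on('end', function() {
      console.log('got ' + rows.length + ' rows');
      client.end();
    });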
#### Prepared statements
I'm still working on the API for prepared statements. Check out the
tests for more up to date examples, but what I'm working towards is
something like this:
    var client = new Client({
      user: 'brian',
      database: 'test'
    });

    var query = client.query({
      text: 'select * from person where age < $1',
      values: [21]
    });

    query.on('row', function(row) {
      console.log(row);
    });

    query.on('end', function() {
      client.end();
    });
## Testing

The tests are split into two categories: unit tests and
integration tests.
### Unit tests

Unit tests do not depend on having access to a
running PostgreSQL server. They work by mocking out the `net.Stream`
instance into a `MemoryStream`. The memory stream raises 'data'
events with pre-populated packets which simulate communication from an
actual PostgreSQL server. Some tests validate that incoming packets
are parsed correctly by the __Connection__ and some tests validate that the
__Connection__ correctly sends outgoing packets to the stream.
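To make that pattern concrete, here's a hypothetical sketch of such a test. The `MemoryStream` name comes from the description above, but its location, the connection's require path, and the hand-built bytes are all assumptions, not the repo's actual test helpers:

    var assert = require('assert');
    var Connection = require('node-postgres/lib/connection'); //path assumed
    var MemoryStream = require('./test/test-helper').MemoryStream; //location assumed

    var stream = new MemoryStream();
    var con = new Connection({stream: stream});

    var sawReadyForQuery = false;
    con.once('readyForQuery', function() {
      sawReadyForQuery = true;
    });

    //hand-built ReadyForQuery packet: 'Z', int32 length 5, status 'I' (idle)
    stream.emit('data', new Buffer([0x5a, 0x00, 0x00, 0x00, 0x05, 0x49]));

    assert.ok(sawReadyForQuery, 'should parse an inbound readyForQuery packet');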
### Integration tests

The integration tests operate on an actual database and require
access to a running PostgreSQL server. They're in a bit more flux as the api for the client is
changing a bit; however, they should always all be passing on every
push up to the ol' githubber.
### Running tests

You can run any test file directly by doing
`node test/unit/connection/inbound-parser-tests.js` or the
like.

However, you can specify command line arguments after the file
and they will be picked up and used in the tests. None of the
arguments are used in _unit_ tests, so you're safe to just blast away
with the command like above, but if you'd like to execute an
_integration_ test, you outta specify your database, the user to use for
testing, and optionally a password.

To do so you would do something like so:

    node test/integration/client/simple-query-tests.js -u brian -d test_db
If you'd like to execute all the unit or integration tests at one
time, you can do so with the "run.js" script in the /test directory as
follows:

##### Run all unit tests

    node test/run.js -t unit

or optionally, since `-t unit` is the default

    node test/run.js

##### Run all integration tests

    node test/run.js -t integration -u brian -d test_db --password password!

##### Run all the tests!

    node test/run.js -t all -u brian -d test_db --password password!
In short, I tried to make executing the tests as easy as possible.
Hopefully this will encourage you to fork, hack, and do whatever you
please as you've got a nice, big safety net under you.
#### Test data
In order for the integration tests to not take ages to run, I've
pulled out the script used to generate test data. This way you can
generate a "test" database once and don't have to up/down the tables
every time an integration test runs. To run the generation script,
execute the script with the same command line arguments passed to any
other test script.
    node script/create-test-tables.js -u user -d database

Additionally, if you want to revert the test data, you'll need to "down"
the database first and then re-create the data as follows:

    node script/create-test-tables.js -u user -d database --down
    node script/create-test-tables.js -u user -d database
## TODO

- Query results returned
  - some way to return number of rows inserted/updated etc.
    (supported in protocol and handled in __Connection__ but not sure
    where on the __Client__ api to add this functionality; one possible
    shape is sketched below)
- Typed result set support in client
  - simple queries
  - bound commands
  - edge cases
    - [numeric 'NaN' result](http://www.postgresql.org/docs/8.4/static/datatype-numeric.html)
    - float Infinity, -Infinity
- Error handling
  - disconnection, removal of listeners on errors
  - passing errors to callbacks?
  - more integration testing
- Bound command support in client
  - type specification
  - parameter specification
  - transparent bound command caching?
  - nice "cursor" (portal) api
- Connection pooling
- Copy data?
- Kiss the sky
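For the row-count item above, one hypothetical shape (pure speculation on my part, not a committed API) would be for the query's 'end' event to hand back a result object:

    //hypothetical api: 'end' passes a result object carrying the row
    //count from the command tag; nothing like this exists yet
    var query = client.query("insert into ids(id) values(1)");
    query.on('end', function(result) {
      console.log('inserted ' + result.rowCount + ' row(s)');
    });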
## Why?

As soon as I saw node.js for the first time I knew I had found
something lovely and simple and _just what I always wanted!_ So...I
poked around for a while. I was excited. I still am!

Let's say for argument's sake you have to run a query from node.js on PostgreSQL before the
last petal falls off the rose and you are stuck as a beast forever?
What if your entire production site depends on it? Well, fret no
more. And let [GastonDB](http://www.snipetts.com/ashley/mymusicals/disney/beauty-an-beast/images/gaston.gif) be vanquished.

I drew major inspiration from
[postgres-js](http://github.com/creationix/postgres-js). I didn't
just fork and contribute because it has
_0_ tests included with it, adds a bunch of methods to the Buffer()
object, and doesn't seem to be maintained. Still...it was a lovely way
to learn & excellent reference material.

I also drew some major inspirrado from
[node-mysql](http://github.com/felixge/node-mysql) and liked what I