The public API of the next generation java.jdbc library.

The basic building blocks are the `java.sql`/`javax.sql` classes:

* `DataSource` -- something to get connections from,
* `Connection` -- an active connection to the database,
* `PreparedStatement` -- SQL and parameters combined, from a connection,

and the following functions and a macro:

* `get-datasource` -- given a hash map describing a database or a JDBC connection string, construct a `javax.sql.DataSource` and return it,
* `get-connection` -- given a connectable, obtain a new `java.sql.Connection` from it and return that,
* `plan` -- given a connectable and SQL + parameters or a statement, return a reducible that, when reduced, will execute the SQL and consume the `ResultSet` produced,
* `execute!` -- given a connectable and SQL + parameters or a statement, execute the SQL, consume the `ResultSet` produced, and return a vector of hash maps representing the rows (*); this can be datafied to allow navigation of foreign keys into other tables (either by convention or via a schema definition),
* `execute-one!` -- given a connectable and SQL + parameters or a statement, execute the SQL, consume the first row of the `ResultSet` produced, and return a hash map representing that row; this can be datafied to allow navigation of foreign keys into other tables (either by convention or via a schema definition),
* `prepare` -- given a `Connection` and SQL + parameters, construct a new `PreparedStatement`; in general this should be used with `with-open`,
* `transact` -- the functional implementation of `with-transaction`,
* `with-transaction` -- execute a series of SQL operations within a transaction.

(*) Result sets are built, by default, as vectors of hash maps, containing qualified keywords as column names, but the row builder and result set builder machinery is open: alternatives are provided to produce unqualified keywords as column names, to produce a vector of the column names followed by vectors of column values for each row, and lower-case variants of each.

The following options are supported wherever a `Connection` is created:

* `:auto-commit` -- either `true` or `false`,
* `:read-only` -- either `true` or `false`,
* `:connection` -- a hash map of camelCase properties to set, via reflection, on the `Connection` object after it is created.

The following options are supported wherever a `Statement` or `PreparedStatement` is created:

* `:concurrency` -- `:read-only`, `:updatable`,
* `:cursors` -- `:close`, `:hold`,
* `:fetch-size` -- the fetch size value,
* `:max-rows` -- the maximum number of rows to return,
* `:result-type` -- `:forward-only`, `:scroll-insensitive`, `:scroll-sensitive`,
* `:timeout` -- the query timeout,
* `:statement` -- a hash map of camelCase properties to set, via reflection, on the `Statement` or `PreparedStatement` object after it is created.

In addition, wherever a `PreparedStatement` is created, you may specify:

* `:return-keys` -- either `true` or a vector of key names to return.
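As a rough sketch of how these pieces fit together (the `:dbtype`/`:dbname` values and the `address` table here are placeholders, not part of the library):

```clojure
(ns example.core
  (:require [next.jdbc :as jdbc]))

;; hypothetical database spec -- adjust :dbtype/:dbname for your setup
(def db-spec {:dbtype "h2" :dbname "example"})

(def ds (jdbc/get-datasource db-spec))

;; execute! returns a vector of hash maps with qualified keywords
(jdbc/execute! ds ["select * from address where id = ?" 1])

;; execute-one! returns just the first row as a hash map
(jdbc/execute-one! ds ["select count(*) as total from address"])

;; plan returns a reducible -- rows are consumed without fully realizing them
(into [] (map :address/name) (jdbc/plan ds ["select name from address"]))

;; with-transaction binds a transacted Connection
(jdbc/with-transaction [tx ds]
  (jdbc/execute! tx ["insert into address (name) values (?)" "A. Person"])
  (jdbc/execute! tx ["update address set name = ? where id = ?" "B. Person" 1]))
```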
Standard implementations of `get-datasource` and `get-connection`. Also provides `dbtypes` as a map of all known database types, and the `->pool` function for creating pooled datasource objects.
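A hedged sketch of `->pool`, assuming HikariCP is on the classpath and using placeholder `:dbtype`/`:dbname` values:

```clojure
(ns example.pool
  (:require [next.jdbc :as jdbc]
            [next.jdbc.connection :as connection])
  (:import (com.zaxxer.hikari HikariDataSource)))

;; ->pool takes the pooling datasource class and a db-spec hash map
;; (note: HikariCP expects :username rather than :user for credentials)
(def pooled-ds (connection/->pool HikariDataSource
                                  {:dbtype "h2" :dbname "example"}))

;; the pooled datasource is used like any other connectable
(jdbc/execute-one! pooled-ds ["select 1 as one"])

;; close the pool when finished (HikariDataSource is Closeable)
(.close pooled-ds)
```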
Optional namespace that extends `next.jdbc.prepare/SettableParameter` to various date/time types so that they will all be treated as SQL timestamps (which also supports date and time column types).

Some databases support a wide variety of date/time type conversions. Other databases need a bit of help. You should only require this namespace if your database does not support these conversions automatically.

* H2 and SQLite support conversion of Java Time (`Instant`, `LocalDate`, `LocalDateTime`) out of the box,
* Nearly all databases support conversion of `java.util.Date` out of the box -- except PostgreSQL apparently!

Types supported:

* `java.time.Instant`
* `java.time.LocalDate`
* `java.time.LocalDateTime`
* `java.util.Date` -- mainly for PostgreSQL

PostgreSQL does not seem able to convert `java.util.Date` to a SQL timestamp by default (every other database can!) so you'll probably need to require this namespace, even if you don't use Java Time.
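A minimal sketch (the `events` table and its columns are hypothetical): requiring the namespace for its side effects is enough to install the conversions, after which Java Time values can be passed as parameters directly:

```clojure
(ns example.dates
  (:require [next.jdbc :as jdbc]
            [next.jdbc.date-time]) ; required purely for side effects
  (:import (java.time Instant LocalDate)))

(def ds (jdbc/get-datasource {:dbtype "postgresql" :dbname "example"}))

;; with the namespace loaded, Instant/LocalDate values are set as SQL timestamps
(jdbc/execute-one! ds
  ["insert into events (name, occurred_at, event_date) values (?, ?, ?)"
   "signup" (Instant/now) (LocalDate/now)])
```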
Builders that treat NULL SQL values as 'optional' and omit the corresponding keys from the Clojure hash maps for the rows.
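A hedged sketch of how these builders are used (the datasource and `address` table are placeholders): pass one of them as the `:builder-fn` option so that columns whose value is NULL simply do not appear in the resulting maps:

```clojure
(ns example.optional
  (:require [next.jdbc :as jdbc]
            [next.jdbc.optional :as opt]))

(def ds (jdbc/get-datasource {:dbtype "h2" :dbname "example"}))

;; rows come back without keys for NULL columns; here with unqualified keywords
(jdbc/execute! ds
  ["select id, name, nickname from address"]
  {:builder-fn opt/as-unqualified-maps})
```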
Mostly an implementation namespace for how `PreparedStatement` objects are created by the next generation java.jdbc library.

`set-parameters` is public and may be useful if you have a `PreparedStatement` that you wish to reuse and (re)set the parameters on it.

`execute-batch!` provides a way to add batches of parameters to a `PreparedStatement` and then execute it in batch mode (via `.executeBatch`).

Defines the `SettableParameter` protocol for converting Clojure values to database-specific values.
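A rough sketch (the `address` table is a placeholder) of preparing a statement once and executing it in batch mode:

```clojure
(ns example.batch
  (:require [next.jdbc :as jdbc]
            [next.jdbc.prepare :as prep]))

(def ds (jdbc/get-datasource {:dbtype "h2" :dbname "example"}))

;; prepare should generally be wrapped in with-open so the statement is closed
(with-open [con (jdbc/get-connection ds)
            ps  (jdbc/prepare con ["insert into address (name, email) values (?, ?)"])]
  ;; execute-batch! sets each group of parameters and runs .executeBatch
  (prep/execute-batch! ps [["Alice" "alice@example.com"]
                           ["Bob"   "bob@example.com"]]))
```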
This is the extensible core of the next generation java.jdbc library.

* `Sourceable` -- for producing `javax.sql.DataSource` objects,
* `Connectable` -- for producing new `java.sql.Connection` objects,
* `Executable` -- for executing SQL operations,
* `Preparable` -- for producing new `java.sql.PreparedStatement` objects,
* `Transactable` -- for executing SQL operations transactionally.
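As a hedged illustration of the extension point (the `SecretStoreSpec` record and `fetch-db-spec` helper below are hypothetical, not part of the library), a custom source of connection details can participate in `get-datasource` by extending `Sourceable`:

```clojure
(ns example.protocols
  (:require [next.jdbc :as jdbc]
            [next.jdbc.protocols :as p]))

;; hypothetical: a record that knows how to look up its own db-spec
(defrecord SecretStoreSpec [secret-name])

(defn fetch-db-spec
  "Hypothetical helper that resolves a secret name to a db-spec map."
  [secret-name]
  {:dbtype "h2" :dbname secret-name})

(extend-protocol p/Sourceable
  SecretStoreSpec
  (get-datasource [this]
    (jdbc/get-datasource (fetch-db-spec (:secret-name this)))))

;; now next.jdbc/get-datasource accepts the custom type
(def ds (jdbc/get-datasource (->SecretStoreSpec "example")))
```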
Provides functions for use with the `:table-fn` and `:column-fn` options that define how SQL entities should be quoted in strings constructed from Clojure data.
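A brief sketch (the `user` table and the connection details are placeholders) showing the quoting functions at the REPL and passed as options to the friendly SQL functions:

```clojure
(ns example.quoted
  (:require [next.jdbc :as jdbc]
            [next.jdbc.quoted :as quoted]
            [next.jdbc.sql :as sql]))

(quoted/ansi "user")   ;=> "\"user\""
(quoted/mysql "user")  ;=> "`user`"

(def ds (jdbc/get-datasource {:dbtype "mysql" :dbname "example"
                              :user "root" :password "secret"}))

;; quote table and column names when SQL is built from Clojure data
(sql/insert! ds :user {:name "Alice"}
             {:table-fn quoted/mysql :column-fn quoted/mysql})
```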
An implementation of `ResultSet` handling functions.

Defines the following protocols:

* `DatafiableRow` -- for turning a row into something datafiable
* `ReadableColumn` -- to read column values by label or index
* `RowBuilder` -- for materializing a row
* `ResultSetBuilder` -- for materializing a result set

A broad range of result set builder implementation functions are provided.

Also provides the default implementations for `Executable` and the default `datafy`/`nav` behavior for rows from a result set.
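A hedged sketch of selecting among the provided builders via the `:builder-fn` option (the `address` table is a placeholder):

```clojure
(ns example.builders
  (:require [next.jdbc :as jdbc]
            [next.jdbc.result-set :as rs]))

(def ds (jdbc/get-datasource {:dbtype "h2" :dbname "example"}))

;; default: vectors of hash maps with qualified keywords, e.g. :address/name

;; unqualified, lower-case keywords instead
(jdbc/execute! ds ["select * from address"]
               {:builder-fn rs/as-unqualified-lower-maps})

;; a vector of column names followed by vectors of column values per row
(jdbc/execute! ds ["select * from address"]
               {:builder-fn rs/as-arrays})
```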
Specs for the core API of next.jdbc.

The functions from `next.jdbc`, `next.jdbc.sql`, and `next.jdbc.prepare` have specs here.

Just `:args` are spec'd. These specs are intended to aid development with `next.jdbc` by catching simple errors in calling the library. The `connectable` argument is currently just `any?` but both `get-datasource` and `get-connection` have stricter specs. If you extend `Sourceable` or `Connectable`, those specs will likely be too strict.

In addition, there is an `instrument` function that provides a simple way to instrument all of the `next.jdbc` functions, and `unstrument` to undo that.
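A short sketch of turning instrumentation on and off during development:

```clojure
(ns example.dev
  (:require [next.jdbc :as jdbc]
            [next.jdbc.specs :as specs]))

;; turn argument checking on while developing
(specs/instrument)

;; calls with malformed arguments -- e.g. a bare string where a vector of
;; SQL + parameters is expected -- should now fail fast with a spec error

;; turn it back off when done
(specs/unstrument)
```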
Some utility functions that make common operations easier by providing some syntactic sugar over `execute!`/`execute-one!`.

This is intended to provide a minimal level of parity with `clojure.java.jdbc` (`insert!`, `insert-multi!`, `query`, `find-by-keys`, `get-by-id`, `update!`, and `delete!`). For anything more complex, use a library like HoneySQL https://github.com/jkk/honeysql to generate SQL + parameters.

The following options are supported:

* `:table-fn` -- specify a function used to convert table names (strings) to SQL entity names -- see the `next.jdbc.quoted` namespace for the most common quoting strategy functions,
* `:column-fn` -- specify a function used to convert column names (strings) to SQL entity names -- see the `next.jdbc.quoted` namespace for the most common quoting strategy functions.

In addition, `find-by-keys` supports `:order-by` to add an `ORDER BY` clause to the generated SQL.
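A hedged sketch of the friendly SQL functions in action (the `address` table and its columns are placeholders):

```clojure
(ns example.sql
  (:require [next.jdbc :as jdbc]
            [next.jdbc.sql :as sql]))

(def ds (jdbc/get-datasource {:dbtype "h2" :dbname "example"}))

;; insert a row and (on most databases) get the generated keys back
(sql/insert! ds :address {:name "Alice" :email "alice@example.com"})

;; insert several rows: column names, then a vector of value rows
(sql/insert-multi! ds :address
                   [:name :email]
                   [["Bob"   "bob@example.com"]
                    ["Carol" "carol@example.com"]])

;; simple lookups
(sql/get-by-id ds :address 1)
(sql/find-by-keys ds :address {:name "Alice"} {:order-by [:email]})

;; update and delete take a "where" map (or a SQL clause vector)
(sql/update! ds :address {:email "new@example.com"} {:id 1})
(sql/delete! ds :address {:id 1})

;; query is execute! with a friendlier name
(sql/query ds ["select * from address where name like ?" "A%"])
```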
Some utility functions for building SQL strings.

These were originally private functions in `next.jdbc.sql` but they may prove useful to developers who want to write their own 'SQL sugar' functions, such as a database-specific `upsert!` etc.
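A brief REPL sketch (table and column names are placeholders, and the result comments show approximately what these helpers produce): each builder returns a vector of a SQL string followed by its parameters, ready to hand to `execute!`:

```clojure
(ns example.builder
  (:require [next.jdbc.sql.builder :as builder]))

(builder/for-insert :address {:name "Alice" :email "alice@example.com"} {})
;; => approximately ["INSERT INTO address (name, email) VALUES (?, ?)"
;;                   "Alice" "alice@example.com"]

(builder/for-query :address {:name "Alice"} {:order-by [:email]})
;; => approximately ["SELECT * FROM address WHERE name = ? ORDER BY email" "Alice"]

(builder/for-update :address {:email "new@example.com"} {:id 1} {})
;; => approximately ["UPDATE address SET email = ? WHERE id = ?"
;;                   "new@example.com" 1]
```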