A DynamoDB driver written in pure Clojure. Lightweight dependencies. GraalVM/native-image friendly.
Leiningen/Boot:

```clojure
[com.github.igrishaev/dynamodb "0.1.2"]
```

Clojure CLI/deps.edn:

```clojure
com.github.igrishaev/dynamodb {:mvn/version "0.1.2"}
```
The documentation is available at cljdoc.org (built automatically).
At the moment, only the most important API targets are implemented. The rest of them are a matter of time and copy-paste. Let me know if you need something missing from the table below.
Target | Done? | Comment |
---|---|---|
BatchExecuteStatement | ||
BatchGetItem | + | |
BatchWriteItem | ||
CreateBackup | + | |
CreateGlobalTable | ||
CreateTable | + | |
DeleteBackup | ||
DeleteItem | + | |
DeleteTable | + | |
DescribeBackup | + | |
DescribeContinuousBackups | ||
DescribeContributorInsights | ||
DescribeEndpoints | ||
DescribeExport | ||
DescribeGlobalTable | ||
DescribeGlobalTableSettings | ||
DescribeImport | ||
DescribeKinesisStreamingDestination | ||
DescribeLimits | ||
DescribeTable | + | |
DescribeTableReplicaAutoScaling | ||
DescribeTimeToLive | ||
DisableKinesisStreamingDestination | ||
EnableKinesisStreamingDestination | ||
ExecuteStatement | ||
ExecuteTransaction | ||
ExportTableToPointInTime | ||
GetItem | + | |
ImportTable | ||
ListBackups | ||
ListContributorInsights | ||
ListExports | ||
ListGlobalTables | ||
ListImports | ||
ListTables | + | |
ListTagsOfResource | ||
PutItem | + | |
Query | + | |
RestoreTableFromBackup | ||
RestoreTableToPointInTime | ||
Scan | + | |
TagResource | + | |
TransactGetItems | ||
TransactWriteItems | ||
UntagResource | ||
UpdateContinuousBackups | ||
UpdateContributorInsights | ||
UpdateGlobalTable | ||
UpdateGlobalTableSettings | ||
UpdateItem | + | |
UpdateTable | ||
UpdateTableReplicaAutoScaling | ||
UpdateTimeToLive |
This library is a part of Teleward, a Telegram captcha bot. The bot is hosted in Yandex Cloud as a binary file compiled with GraalVM. It uses the library to track its state in Yandex DB, a cloud database that mimics DynamoDB and serves a subset of its HTTP API.
First, import the library:
```clojure
(require '[dynamodb.api :as api])
(require '[dynamodb.constant :as const])
```
The `constant` module is sometimes needed to refer to common DynamoDB values like `"PAY_PER_REQUEST"`, `"PROVISIONED"`, and so on.
The `dynamodb.encode` namespace provides the `IEncode` protocol with the `(-encode [this])` method. The library extends `Boolean`, `String`, `Number`, and other types to encode them properly. There is also a default implementation for the `Object` type that encodes any value as a string.

To establish your own encoding rules for a certain type, extend it with the protocol as follows:
```clojure
(extend-protocol IEncode
  MyDateTimeType
  (-encode [this]
    (-encode (.milliseconds this))))
```
In the example above, a custom `MyDateTimeType` type gets transformed into milliseconds and encoded as a number. Thus, the result will be something like `{:N "1676628480718"}`.
At the moment, the values are decoded using a `case` form with no way for extension. This behaviour is subject to improvement/rework (see the `dynamodb.decode` namespace).
Prepare a client object. The first four parameters are mandatory:
```clojure
(def CLIENT
  (api/make-client "aws-public-key"
                   "aws-secret-key"
                   "https://aws.dynamodb.endpoint.com/some/path"
                   "aws-region"
                   {...}))
```
For Yandex DB, the region is something like "ru-central1".
Both public and secret AWS keys are masked with a special wrapper that prevents them from being logged or printed.
The fifth parameter is a map of options to override:

Parameter | Default | Description |
---|---|---|
`:throw?` | `true` | Whether to throw an exception on a negative DynamoDB response. |
`:version` | `"20120810"` | The DynamoDB API version. |
`:http-opt` | (see below) | A map of HTTP Kit settings (overrides the defaults below). |
The default HTTP settings are:
```clojure
{:user-agent "com.github.igrishaev/dynamodb"
 :keepalive (* 30 1000)
 :insecure? true
 :follow-redirects false}
```
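As an illustration, here is a client that keeps connections alive longer and, with `:throw?` set to `false`, returns a negative response as data rather than throwing it. This is only a sketch: the keys, endpoint, and region are placeholders.

```clojure
;; A sketch only: the keys, endpoint, and region are placeholders.
(def CLIENT-QUIET
  (api/make-client "aws-public-key"
                   "aws-secret-key"
                   "https://aws.dynamodb.endpoint.com/some/path"
                   "aws-region"
                   {:throw? false
                    :http-opt {:keepalive (* 60 1000)}}))
```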
To create a new table, pass its name, the schema map, and the primary key mapping:
```clojure
(api/create-table CLIENT
                  "SomeTable"
                  {:user/id :N
                   :user/name :S}
                  {:user/id const/key-type-hash
                   :user/name const/key-type-range}
                  {:tags {:foo "hello"}
                   :table-class const/table-class-standard
                   :billing-mode const/billing-mode-pay-per-request})
```
Tables can be listed by pages. The default page size is 100. Once you've reached the limit, check out the `LastEvaluatedTableName` field. Pass it to the `:start-table` optional argument to proceed to the next page:
```clojure
(def resp1
  (api/list-tables CLIENT {:limit 10}))

(def last-table
  (:LastEvaluatedTableName resp1))

(def resp2
  (api/list-tables CLIENT
                   {:limit 10
                    :start-table last-table}))
```
To upsert an item, pass a map that contains the primary attributes:
```clojure
(api/put-item CLIENT
              "SomeTable"
              {:user/id 1
               :user/name "Ivan"
               :user/foo 1}
              {:return-values const/return-values-none})
```
Pass `:sql-condition` to make the operation conditional. In the example above, the `:user/foo` attribute is set to 1. The second upsert operation below checks if `:user/foo` is either 1, 2, or 3, which is true. Thus, it will pass:
```clojure
(api/put-item CLIENT
              "SomeTable"
              {:user/id 1
               :user/name "Ivan"
               :user/test 3}
              {:sql-condition "#foo in (:one, :two, :three)"
               :attr-names {"#foo" :user/foo}
               :attr-values {":one" 1
                             ":two" 2
                             ":three" 3}
               :return-values const/return-values-all-old})
```
To get an item, provide its primary key:

```clojure
(api/get-item CLIENT
              "SomeTable"
              {:user/id 1
               :user/name "Ivan"})
```

Result:

```clojure
{:Item #:user{:id 1
              :name "Ivan"
              :foo 1}}
```
There is an option to get only the attributes you need or even sub-attributes for nested maps or lists:
```clojure
;; put some complex values
(api/put-item CLIENT
              "SomeTable"
              {:user/id 1
               :user/name "Ivan"
               :test/kek "123"
               :test/foo 1
               :abc nil
               :foo "lol"
               :bar {:baz [1 2 3]}})

;; pass a list of attributes/paths into the `:attrs-get` param
(api/get-item CLIENT
              "SomeTable"
              {:user/id 1
               :user/name "Ivan"}
              {:attrs-get [:test/kek "bar.baz[1]" "abc" "foo"]})

;; the result:
{:Item {:test/kek "123"
        :bar {:baz [2]}
        :abc nil
        :foo "lol"}}
```
This operation is the most complex one. With the AWS SDK or Faraday, to update an item's secondary attributes, one has to manually build an SQL expression, which involves string formatting, concatenation, and similar boring stuff:

```
SET username = :username, email = :email, ...
```

The `ADD`, `DELETE`, and `REMOVE` expressions require manual work as well.
The present library solves this problem for you. The `update-item` function accepts `:add`, `:set`, `:delete`, and `:remove` parameters, either maps or vectors.

The `:sql-condition` argument accepts a plain SQL expression. Should it evaluate to false, the item won't be affected and you'll get a negative response.
```clojure
(api/update-item CLIENT
                 table
                 {:user/id 1
                  :user/name "Ivan"}
                 {:attr-names {"#counter" :test/counter}
                  :attr-values {":one" 1}
                  :set {"Foobar" 123
                        :user/email "test@test.com"
                        "#counter" (api/sql "#counter + :one")}})
```
The example above covers three various cases of the `:set` argument:

- The `"Foobar"` key is a plain string, and the value is plain as well.
- The `:user/email` key is a keyword (it may carry a namespace) which cannot be placed in a SQL expression directly. Under the hood, the library produces an alias for it and injects it into `ExpressionAttributeNames`.
- The `"#counter"` key is an alias whose value is a raw SQL expression made with `api/sql`. The alias `#counter` should be declared in the `:attr-names` map.

The `:add` parameter accepts a map of an attribute (or an alias) to a value. Imagine you have the following item in the db:
```clojure
{:user/id 1
 :user/name "Ivan"
 :amount 3
 :test/colors #{"r" "g"}}
```
To increase the amount and add a new color into the colors set, perform:
```clojure
(api/update-item CLIENT
                 table
                 {:user/id 1
                  :user/name "Ivan"}
                 {:add {"amount" 1
                        :test/colors #{"b"}}})
```
Result:
```clojure
{:Item
 {:amount 4
  :user/id 1
  :test/colors #{"b" "r" "g"}
  :user/name "Ivan"}}
```
To remove an attribute, pass the `:remove` vector. Each item of that vector is either a keyword attribute, a raw string expression, or an alias.
```clojure
(api/update-item CLIENT
                 table
                 {:user/id 1
                  :user/name "Ivan"}
                 {:attr-names {"#kek" :test/kek}
                  :remove ["#kek" "abc" :test/lol]})
```
To remove an item from a list, pass a string like this:

```clojure
;; an item in the database
{:tags ["foo" "bar" "baz"]}

;; the :remove parameter
{:remove ["tags[1]"]}
```
Use an alias when the attribute is a keyword with a namespace:
```clojure
(api/update-item CLIENT
                 table
                 {:user/id 1
                  :user/name "Ivan"}
                 {:attr-names {"#tags" :user/tags}
                  :remove ["#tags[1]"]})
```
In DynamoDB, the `DELETE` clause is used to remove items from sets. The `update-item` function accepts the `:delete` argument which is a map. The key is either a keyword or a string alias; the value is always a set.
The item:
```clojure
{:user/id 1
 :user/name "Ivan"
 :user/colors #{"r" "g" "b"}}
```
API call:
```clojure
(api/update-item CLIENT
                 table
                 {:user/id 1
                  :user/name "Ivan"}
                 {:delete {:user/colors #{"r" "b"}}})
```
Result:
```clojure
{:Item #:user{:colors #{"g"} :id 1 :name "Ivan"}}
```
Simple deletion of an item:
```clojure
(api/delete-item CLIENT
                 table
                 {:user/id 1 :user/name "Ivan"})
```
A conditional deletion throws an exception when the expression fails:
```clojure
(api/put-item CLIENT
              table
              {:user/id 1
               :user/name "Ivan"
               :test/kek 99})

(api/delete-item CLIENT
                 table
                 {:user/id 1 :user/name "Ivan"}
                 {:sql-condition "#kek in (:foo, :bar, :baz)"
                  :attr-names {"#kek" :test/kek}
                  :attr-values {":foo" 1
                                ":bar" 2
                                ":baz" 3}})
```
In the example above, the `"#kek in (:foo, :bar, :baz)"` expression fails as the `:test/kek` attribute is 99. The item stays in the database, and you'll get an `ex-info` exception carrying the following data:
```clojure
{:error? true
 :status 400
 :path "com.amazonaws.dynamodb.v20120810"
 :exception "ConditionalCheckFailedException"
 :message "The conditional request failed"
 :payload
 {:TableName table
  :Key #:user{:id {:N "1"} :name {:S "Ivan"}}
  :ConditionExpression "#kek in (:foo, :bar, :baz)"
  :ExpressionAttributeNames {"#kek" :test/kek}
  :ExpressionAttributeValues
  {":foo" {:N "1"} ":bar" {:N "2"} ":baz" {:N "3"}}}
 :target "DeleteItem"}
```
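Since the map above travels inside an `ex-info` exception, it can be reached with `ex-data`. Here is a minimal handling sketch (assuming the same `CLIENT` and `table` as above):

```clojure
;; A sketch of handling a failed conditional delete.
(try
  (api/delete-item CLIENT
                   table
                   {:user/id 1 :user/name "Ivan"}
                   {:sql-condition "#kek in (:foo, :bar, :baz)"
                    :attr-names {"#kek" :test/kek}
                    :attr-values {":foo" 1 ":bar" 2 ":baz" 3}})
  (catch clojure.lang.ExceptionInfo e
    (let [{:keys [exception status]} (ex-data e)]
      ;; e.g. log the failure and decide what to do next
      (when (= exception "ConditionalCheckFailedException")
        (println "Condition failed with status" status)))))
```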
The Query target allows searching for items that match the primary key partially or match some range. Imagine the primary key of a table is `:user/id :HASH` and `:user/name :RANGE`. Here is what you have in the database:
```clojure
{:user/id 1
 :user/name "Ivan"
 :test/foo 1}

{:user/id 1
 :user/name "Juan"
 :test/foo 2}

{:user/id 2
 :user/name "Huan"
 :test/foo 3}
```
Now, to find the items whose `:user/id` is 1, execute:
```clojure
(api/query CLIENT
           table
           {:sql-key "#id = :one"
            :attr-names {"#id" :user/id}
            :attr-values {":one" 1}
            :limit 1})
```
Result:
```clojure
{:Items [{:user/id 1
          :test/foo 1
          :user/name "Ivan"}]
 :Count 1
 :ScannedCount 1
 :LastEvaluatedKey #:user{:id 1
                          :name "Ivan"}}
```
To proceed to the next page, fetch the `LastEvaluatedKey` field from the result and pass it into the `:start-key` query parameter.
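Collecting all pages can be sketched as a loop over `:start-key`. This is only a sketch: it reuses the key expression from the example above, and the page size is arbitrary.

```clojure
;; A sketch of fetching every page of a query, reusing the
;; key expression from the example above.
(defn query-all-pages [client table]
  (loop [start-key nil
         acc []]
    (let [resp (api/query client
                          table
                          (cond-> {:sql-key "#id = :one"
                                   :attr-names {"#id" :user/id}
                                   :attr-values {":one" 1}
                                   :limit 25}
                            start-key (assoc :start-key start-key)))
          items (into acc (:Items resp))]
      (if-let [next-key (:LastEvaluatedKey resp)]
        (recur next-key items)
        items))))
```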
The Scan API goes through the whole table, collecting the items that match an expression. This is not optimal, yet sometimes required.
```clojure
(api/scan CLIENT
          table
          {:sql-filter "#foo = :two"
           :attrs-get [:test/foo "#name"]
           :attr-names {"#foo" :test/foo
                        "#name" :user/name}
           :attr-values {":two" 2}
           :limit 2})
```
Result:
```clojure
{:Items [{:test/foo 2
          :user/name "Ivan"}]
 :Count 1
 :ScannedCount 2
 :LastEvaluatedKey #:user{:id 1
                          :name "Ivan"}}
```
Both the `LastEvaluatedKey` field and the `:start-key` parameter work as described above.

See the tests, specs, and the `dynamodb.api` module for more information.
The `api-call` function allows you to interact with DynamoDB on a low level. It accepts the client, the target name, and a raw payload you'd like to send to the DB. The payload gets sent as-is, with no processing or interference.
```clojure
(api/api-call CLIENT
              "NotImplementedTarget"
              {:ParamFoo ... :ParamBar ...})
```
The library provides a number of specs for the API. Find them in the `dynamodb.spec` module. It's not imported by default to prevent the binary file from growing when compiled with GraalVM: it's a known issue that introducing `clojure.spec` adds about 20 MB to the file.

Still, those specs are useful for testing and documentation. Import the specs, then instrument the functions by calling the `instrument` function:
```clojure
(require 'dynamodb.spec)
(require '[clojure.spec.test.alpha :as spec.test])

(spec.test/instrument)
```
Now if you pass something wrong into one of the library functions, you'll get a spec exception.
The primary testing module, `api_test.clj`, relies on a local DynamoDB instance running in Docker. To bootstrap it, execute:
```shell
make docker-up
```
It spawns the `amazon/dynamodb-local` image on port 8000. Now connect to the REPL and run the API tests from your editor as usual.
Ivan Grishaev, 2023