This library is compatible with Go 1.10+. Please refer to CHANGELOG.md if you encounter breaking changes.
Data-focused testing belongs to the blackbox testing group, where the main interest lies in the initial and final state of the datastore.
To set the initial state of a datastore, this framework provides utilities either to create an empty datastore or to seed it with dataset data, so that the application under test starts from a known baseline.
Final-state testing focuses on checking that a dataset matches an expected set of values after the application logic has run. Here this library can verify either the complete state or a snapshot state of a datastore. While the first approach compares all table data with the expected set of values, the latter reduces verification to the range provided by the expected dataset.
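To illustrate the difference: with the snapshot policy, a hypothetical expected dataset that lists only two rows verifies just those rows, while the full-table policy would additionally fail if the table contained any rows not present in the expected dataset:

```json
[
  {"id": 1, "username": "Dudi"},
  {"id": 2, "username": "Rudi"}
]
```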
This library has been designed to provide an easy and unified way of testing any datastore (SQL, NoSQL, file logs) on any platform, in any language, and in the cloud. It simplifies test organization through dataset auto-discovery used for datastore preparation and verification. Dataset data can be loaded from various sources: memory, or local or remote CSV and JSON files. All datasets support macro expressions to dynamically evaluate data values, e.g. <ds:sql ["SELECT CURRENT_DATE()"]>. On top of that, expected data can also use predicate expressions to delegate verification of data values, e.g. <ds:between [11301, 11303]>. Finally, a dataset can act like a view, storing data for many datastore sources in just one dataset file.
Datastore initialization and dataset data verification can be managed locally or remotely on a remote datastore unit test server.
- With dedicated expected data folder
```go
import (
	"path"
	"testing"

	_ "github.com/go-sql-driver/mysql"
	"github.com/viant/dsunit"
	"github.com/viant/toolbox"
)

func Test_Usecase(t *testing.T) {
	parent := toolbox.CallerDirectory(3)
	if !dsunit.InitFromURL(t, path.Join(parent, "test", "config.yaml")) {
		return
	}

	// ... business test logic comes here

	expectURL := path.Join(parent, "test/case1/data/expect")
	expectedData := dsunit.NewDatasetResource("db1", expectURL, "", "")
	dsunit.Expect(t, dsunit.NewExpectRequest(dsunit.FullTableDatasetCheckPolicy, expectedData))
}
```
- With shared expected data folder
```go
func Test_Usecase(t *testing.T) {
	parent := toolbox.CallerDirectory(3)
	if !dsunit.InitFromURL(t, path.Join(parent, "test", "config.yaml")) {
		return
	}

	// ... business test logic comes here

	baseDir := path.Join(parent, "test", "data")
	dsunit.ExpectFor(t, "db1", dsunit.FullTableDatasetCheckPolicy, baseDir, "use_case_1")
}
```
```json
[
  {},
  {"id": 1, "name": "name 1"},
  {"id": 2, "name": "name 2"}
]
```
When pre-seeding a table with data, if the first element is an empty map, dsunit deletes all records from the table before inserting the supplied dataset.
```json
[]
```
An empty array used with the prepare method removes all records from the table.
```go
registerResponse := service.Register(dsunit.NewRegisterRequest("db1",
	&dsc.Config{
		DriverName: "sqlite3",
		Descriptor: "[url]",
		Parameters: map[string]interface{}{
			"url": filename,
		},
	}))
if registerResponse.Status != "ok" {
	log.Fatal(registerResponse.Error)
}
```
```go
response := service.Freeze(&dsunit.FreezeRequest{
	Datastore: "db1",
	DestURL:   "/tmp/dn1/expect/users.json",
	SQL:       "SELECT * FROM users",
})
```
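Freeze captures the current query result as a dataset file, so it can be used to bootstrap expected data for later verification. Assuming the users table currently held the two rows used in the directive examples below, the generated users.json would contain a dataset along these lines (a hypothetical illustration, not verbatim tool output):

```json
[
  {"id": 1, "username": "Dudi", "active": true, "salary": 12400},
  {"id": 2, "username": "Rudi", "active": true, "salary": 12600}
]
```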
Service Methods | Description | Request | Response |
---|---|---|---|
Register(t *testing.T, request *RegisterRequest) bool | register database connection | RegisterRequest | RegisterResponse |
RegisterFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | RegisterRequest | RegisterResponse |
Recreate(t *testing.T, request *RecreateRequest) bool | recreate database/datastore | RecreateRequest | RecreateResponse |
RecreateFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | RecreateRequest | RecreateResponse |
RunSQL(t *testing.T, request *RunSQLRequest) bool | run SQL commands | RunSQLRequest | RunSQLResponse |
RunSQLFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | RunSQLRequest | RunSQLResponse |
RunScript(t *testing.T, request *RunScriptRequest) bool | run SQL script | RunScriptRequest | RunSQLResponse |
RunScriptFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | RunScriptRequest | RunSQLResponse |
AddTableMapping(t *testing.T, request *MappingRequest) bool | register database table mapping (view) | MappingRequest | MappingResponse |
AddTableMappingFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | MappingRequest | MappingResponse |
Init(t *testing.T, request *InitRequest) bool | initialize datastore (register, recreate, run sql, add mapping) | InitRequest | InitResponse |
InitFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | InitRequest | InitResponse |
Prepare(t *testing.T, request *PrepareRequest) bool | populate datastore with provided data | PrepareRequest | PrepareResponse |
PrepareFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | PrepareRequest | PrepareResponse |
PrepareDatastore(t *testing.T, datastore string) bool | populates the datastore with all data files that are in the same location as the test file, with the same test file prefix, followed by the lower camel case test name | n/a | n/a |
PrepareFor(t *testing.T, datastore string, baseDirectory string, method string) bool | populates the datastore with all data files located in baseDirectory matching the method name | n/a | n/a |
Expect(t *testing.T, request *ExpectRequest) bool | verify datastore with provided data | ExpectRequest | ExpectResponse |
ExpectFromURL(t *testing.T, URL string) bool | as above, where JSON request is fetched from URL/relative path | ExpectRequest | ExpectResponse |
ExpectDatasets(t *testing.T, datastore string, checkPolicy int) bool | verifies the datastore against all data files that are in the same location as the test file, with the same test file prefix, followed by the lower camel case test name | n/a | n/a |
ExpectFor(t *testing.T, datastore string, checkPolicy int, baseDirectory string, method string) bool | verifies the datastore against all dataset files located in baseDirectory matching the method name | n/a | n/a |
Freeze(request *FreezeRequest) *FreezeResponse | creates a dataset file from the result of the supplied SQL (see the Freeze example above) | FreezeRequest | FreezeResponse |
Dump(request *DumpRequest) *DumpResponse | creates a database schema from an existing database for supplied tables, datastore, and target vendor | DumpRequest | DumpResponse |
Compare(request *CompareRequest) *CompareResponse | compares data based on specified SQLs from various databases | CompareRequest | CompareResponse |
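The PrepareFor/ExpectFor pair above enables a compact seed-run-verify test pattern. A minimal sketch, assuming the config file and test/data directory layout from the earlier examples (the datastore name db1 and the use case name use_case_1 are placeholders):

```go
package mypackage

import (
	"path"
	"testing"

	"github.com/viant/dsunit"
	"github.com/viant/toolbox"
)

func Test_UseCase1(t *testing.T) {
	parent := toolbox.CallerDirectory(3)
	// Register, recreate and script the datastore from config.
	if !dsunit.InitFromURL(t, path.Join(parent, "test", "config.yaml")) {
		return
	}
	baseDir := path.Join(parent, "test", "data")
	// Seed the datastore from data files matching the use case name.
	if !dsunit.PrepareFor(t, "db1", baseDir, "use_case_1") {
		return
	}

	// ... business test logic comes here

	// Verify the final datastore state against the expected dataset files.
	dsunit.ExpectFor(t, "db1", dsunit.FullTableDatasetCheckPolicy, baseDir, "use_case_1")
}
```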
This library uses assertly as the underlying validation mechanism.
A macro is an expression with parameters that expands the original text value. The general format of a macro is: <ds:MACRO_NAME [JSON formatted array of parameters]>
The following macros are built-in:
Name | Parameters | Description | Example |
---|---|---|---|
sql | SQL expression | Returns the value of the SQL expression | <ds:sql["SELECT CURRENT_DATE()"]> |
seq | name of sequence/table for autoincrement | Returns the value of the sequence | <ds:seq["users"]> |
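In a prepare dataset these macros are embedded directly in field values. A hypothetical users.json prepare file using both built-ins (the field names are illustrative):

```json
[
  {
    "id": "<ds:seq[\"users\"]>",
    "username": "Dudi",
    "created": "<ds:sql[\"SELECT CURRENT_DATE()\"]>"
  }
]
```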
A predicate allows an expected value to be evaluated against the actual dataset value using custom predicate logic.
Name | Parameters | Description | Example |
---|---|---|---|
between | from, to values | Evaluates whether the actual value falls between from and to | <ds:between[1.888889, 1.88889]> |
within_sec | base time, delta, optional date format | Evaluates whether the actual time is within delta seconds of the base time | <ds:within_sec["now", 6, "yyyyMMdd HH:mm:ss"]> |
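In an expected dataset a predicate takes the place of a literal value. A hypothetical expected users.json delegating verification of salary and last_access_time (the values are illustrative):

```json
[
  {"@indexBy@": ["id"]},
  {
    "id": 1,
    "salary": "<ds:between[12000, 13000]>",
    "last_access_time": "<ds:within_sec[\"now\", 6, \"yyyyMMdd HH:mm:ss\"]>"
  }
]
```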
Most SQL drivers provide metadata about autoincrement and primary key columns; however, if this is unavailable or partial verification with SQL is used, the following directives come in handy.
@autoincrement@
Allows specifying the autoincrement field
```json
[
  {"@autoincrement@":"id"},
  {"id":1, "username":"Dudi", "active":true, "salary":12400, "comments":"abc", "last_access_time": "2016-03-01 03:10:00"},
  {"id":2, "username":"Rudi", "active":true, "salary":12600, "comments":"def", "last_access_time": "2016-03-01 05:10:00"}
]
```
@indexBy@
(see also the assertly indexBy directive usage for nested data structure validation)
Allows specifying primary key fields
```json
[
  {"@indexBy@":["id"]},
  {"id":1, "username":"Dudi", "active":true, "salary":12400, "comments":"abc", "last_access_time": "2016-03-01 03:10:00"},
  {"id":2, "username":"Rudi", "active":true, "salary":12600, "comments":"def", "last_access_time": "2016-03-01 05:10:00"}
]
```
@fromQuery@
Allows a specified query to fetch the actual dataset to be validated against the expected dataset
users.json
```json
[
  {"@fromQuery@":"SELECT * FROM users where id <= 2 ORDER BY id"},
  {"id":1, "username":"Dudi", "active":true, "salary":12400, "comments":"abc", "last_access_time": "2016-03-01 03:10:00"},
  {"id":2, "username":"Rudi", "active":true, "salary":12600, "comments":"def", "last_access_time": "2016-03-01 05:10:00"}
]
```
API documentation is available in the docs directory.
This project provides various datastore dsunit integration examples (some with docker via endly).
External projects:
The source code is made available under the terms of the Apache License, Version 2, as stated in the file LICENSE.
Individual files may be made available under their own specific license, all compatible with Apache License, Version 2. Please see individual files for details.
Library Author: Adrian Witas
Contributors: Sudhakaran Dharmaraj