Merge branch 'master' into optional-like

deusaquilus authored Nov 19, 2021
2 parents c5f3927 + 3bbade1 commit 6b1a551
Showing 97 changed files with 1,924 additions and 1,065 deletions.
CHANGELOG.md
# 3.11.0

- [Implement `transaction` on outer zio-jdbc-context using fiber refs](https://github.com/getquill/quill/pull/2302)
- [Feature Request: write compile-time queries to a file](https://github.com/getquill/quill/issues/1715)
- [`transaction` supports ZIO effects with mixed environments](https://github.com/getquill/quill/pull/2304)
- [Apple M1 Build Updates & Instructions](https://github.com/getquill/quill/pull/2296)

#### Migration Notes:

All ZIO JDBC context `run` methods have now switched their dependency (i.e. `R`) from `Has[Connection]` to
`Has[DataSource]`. This should clear up many innocent errors that happened because it was unclear how the
`Has[Connection]` was supposed to be provided. As I have come to understand, nearly all DAO service patterns
involve grabbing a connection from a pooled DataSource, doing one single CRUD operation, and then returning
the connection back to the pool. The new JDBC ZIO contexts memorialize this pattern.
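That pool-scoped usage pattern can be sketched without any ZIO or JDBC machinery at all; the `Pool` and `withConnection` names below are hypothetical, purely for illustration:

```scala
// Hypothetical stand-in for a pooled DataSource: borrow a connection,
// run exactly one operation, and always hand the connection back.
trait Pool[C] {
  def acquire(): C
  def release(c: C): Unit
}

// Runs a single operation against a borrowed connection; the connection
// is returned to the pool even if `op` throws.
def withConnection[C, T](pool: Pool[C])(op: C => T): T = {
  val conn = pool.acquire()
  try op(conn)
  finally pool.release(conn)
}
```

This is the discipline the new contexts enforce for each `run` call: the caller supplies the pool (`Has[DataSource]`), never a long-lived connection.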

* The signature of `QIO[T]` has been changed from `ZIO[Has[Connection], SQLException, T]` to `ZIO[Has[DataSource], SQLException, T]`.
A new type-alias `QCIO[T]` (lit. Quill Connection IO) has been introduced that represents `ZIO[Has[Connection], SQLException, T]`.
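For illustration, the shapes of the two aliases can be written out with stand-in types (these are not the real `zio` definitions, just a sketch of the signatures):

```scala
object AliasSketch {
  // Stand-ins for zio.ZIO, zio.Has and the java/javax.sql types; shapes only.
  trait ZIO[-R, +E, +A]
  trait Has[A]
  trait Connection
  trait DataSource
  class SQLException extends Exception

  // 3.11.0 default: `run` depends on a DataSource
  type QIO[T] = ZIO[Has[DataSource], SQLException, T]
  // "Quill Connection IO": depends on an already-open Connection
  type QCIO[T] = ZIO[Has[Connection], SQLException, T]
}
```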

* If you are using the `.onDataSource` command, migration should be fairly easy. Whereas previously, a usage of quill-jdbc-zio 3.10.0
might have looked like this:
```scala
object MyPostgresContext extends PostgresZioJdbcContext(Literal); import MyPostgresContext._
val zioDS = DataSourceLayer.fromPrefix("testPostgresDB")

val people = quote {
query[Person].filter(p => p.name == "Alex")
}

MyPostgresContext.run(people).onDataSource
.tap(result => putStrLn(result.toString))
  .provideCustomLayer(zioDS)
```
In 3.11.0, simply remove the `.onDataSource` call in order to use the new context:
```scala
object MyPostgresContext extends PostgresZioJdbcContext(Literal); import MyPostgresContext._
val zioDS = DataSourceLayer.fromPrefix("testPostgresDB")

val people = quote {
query[Person].filter(p => p.name == "Alex")
}

MyPostgresContext.run(people) // Don't need `.onDataSource` anymore
.tap(result => putStrLn(result.toString))
  .provideCustomLayer(zioDS)
```

* If you are creating a Hikari DataSource directly, passing the dependency is now also simpler. Instead of having to pass
the Hikari-pool-layer into `DataSourceLayer`, just provide the Hikari-pool-layer directly.

From this:
```scala
def hikariConfig = new HikariConfig(JdbcContextConfig(LoadConfig("testPostgresDB")).configProperties)
def hikariDataSource: DataSource with Closeable = new HikariDataSource(hikariConfig)

val zioConn: ZLayer[Any, Throwable, Has[Connection]] =
Task(hikariDataSource).toLayer >>> DataSourceLayer.live


MyPostgresContext.run(people)
.tap(result => putStrLn(result.toString))
.provideCustomLayer(zioConn)
```
To this:
```scala
def hikariConfig = new HikariConfig(JdbcContextConfig(LoadConfig("testPostgresDB")).configProperties)
def hikariDataSource: DataSource with Closeable = new HikariDataSource(hikariConfig)

val zioDS: ZLayer[Any, Throwable, Has[DataSource]] =
Task(hikariDataSource).toLayer // Don't need `>>> DataSourceLayer.live` anymore!

MyPostgresContext.run(people)
.tap(result => putStrLn(result.toString))
  .provideCustomLayer(zioDS)
```

* If you want to provide a `java.sql.Connection` to a ZIO context directly, you can still do it using the `underlying` variable.
```scala
object Ctx extends PostgresZioJdbcContext(Literal); import Ctx._
Ctx.underlying.run(qr1)
  .provide(zio.Has(conn: java.sql.Connection))
```

* Also, when using an underlying context, you can still use `onDataSource` to go from a `Has[Connection]` dependency
back to a `Has[DataSource]` dependency (note that it no longer has to be `with Closeable`).
```scala
object Ctx extends PostgresZioJdbcContext(Literal); import Ctx._
Ctx.underlying.run(qr1)
  .onDataSource
  .provide(zio.Has(ds: javax.sql.DataSource))
```

* Finally, note that the `prepare` methods have been unaffected by this change. They still require a `Has[Connection]`
and have the signature `ZIO[Has[Connection], SQLException, PreparedStatement]`. This is because in order to work
with the result of this value (i.e. to work with `PreparedStatement`), the connection that created it must
still be open.

# 3.10.0

- [Defunct AsyncZioCache accidentally returned in #2174. Remove it.](https://github.com/getquill/quill/pull/2246)
relevant information. ProtoQuill uses them in order to pass Ast information as well as whether
the query is Static or Dynamic into execute and prepare methods. In the future, Scala2-Quill may be enhanced
to use them as well.

# 3.10.1

#### Migration Notes:

The Datastax standard configuration file format and properties (HOCON) are now used to provide Cassandra driver configuration.

Sample HOCON:
```hocon
MyCassandraDb {
preparedStatementCacheSize=1000
keyspace=quill_test

session {
basic.contact-points = [ ${?CASSANDRA_CONTACT_POINT_0}, ${?CASSANDRA_CONTACT_POINT_1} ]
basic.load-balancing-policy.local-datacenter = ${?CASSANDRA_DC}
basic.request.consistency = LOCAL_QUORUM
basic.request.page-size = 3
}

}
```

The `session` entry values and keys are described in the Datastax documentation:
[Reference configuration](https://docs.datastax.com/en/developer/java-driver/4.13/manual/core/configuration/reference/)


The `CassandraZioSession` constructors:

```scala
val zioSessionLayer: ZLayer[Any, Throwable, Has[CassandraZioSession]] =
  CassandraZioSession.fromPrefix("MyCassandraDb")
run(query[Person])
  .provideCustomLayer(zioSessionLayer)
```

Additional parameters can be added programmatically:
```scala
val zioSessionLayer: ZLayer[Any, Throwable, Has[CassandraZioSession]] =
  CassandraZioSession.fromContextConfig(LoadConfig("MyCassandraDb").withValue("keyspace", ConfigValueFactory.fromAnyRef("data")))
run(query[Person])
  .provideCustomLayer(zioSessionLayer)
```


The `session.queryOptions.fetchSize=N` config entry should be replaced by `basic.request.page-size=N`:

```hocon
testStreamDB {
  preparedStatementCacheSize=1000
  keyspace=quill_test

  session {
    ...
    basic.request.page-size = 3
  }
}
```

# 3.9.0

- [Pass Session to all Encoders/Decoders allowing UDT Encoding without local session varaible in contexts e.g. ZIO and others](https://github.com/getquill/quill/pull/2219)
CODEGEN.md

You can import the Code Generator using maven:
````xml
<dependency>
    <groupId>io.getquill</groupId>
    <artifactId>quill-codegen-jdbc_2.13</artifactId>
    <version>3.10.0</version>
</dependency>
````

Or using sbt:
````scala
libraryDependencies += "io.getquill" %% "quill-codegen-jdbc" % "3.10.0"
````


You can invoke the SimpleJdbcCodegen like so:

````scala
// provide DB credentials with a com.typesafe.config.Config object
// (under the hood the credentials are used to create a HikariPool DataSource)
import io.getquill.codegen.jdbc.SimpleJdbcCodegen
import io.getquill.util.LoadConfig

val snakecaseConfig = LoadConfig(configPrefix: String)
val gen = new SimpleJdbcCodegen(snakecaseConfig, "com.my.project") {
  override def nameParser = SnakeCaseNames
}
gen.writeFiles("src/main/scala/com/my/project")

// or, provide an initialized DataSource
import io.getquill.codegen.jdbc.SimpleJdbcCodegen
import org.postgresql.ds.PGSimpleDataSource

val pgDataSource = new PGSimpleDataSource()
pgDataSource.setURL("jdbc:postgresql://127.0.0.1:5432/quill_codegen_example?ssl=false")
pgDataSource.setUser("my_user")
pgDataSource.setPassword("my_password")
val gen = new SimpleJdbcCodegen(pgDataSource, "com.my.project") {
  override def nameParser = SnakeCaseNames
}
gen.writeFiles("src/main/scala/com/my/project")
````

You can parse column and table names using either the `SnakeCaseNames` or the `LiteralNames` parser
in order to generate your schemas with `querySchema` objects.
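As a rough sketch of the difference (the helpers below are not part of the codegen API; they just assume the usual snake_case-to-camelCase convention):

```scala
// LiteralNames-style parsing keeps DB names exactly as they are,
// while SnakeCaseNames-style parsing derives camelCase Scala names.
def literalName(dbName: String): String = dbName

def snakeCaseName(dbName: String): String = {
  val parts = dbName.toLowerCase.split("_").toList
  (parts.head :: parts.tail.map(_.capitalize)).mkString
}
```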

## Composeable Traits Codegen

The `ComposeableTraitsJdbcCodegen` enables more customized code generation.
It allows you to determine which tables to generate entity classes for,
their naming strategy, and the Scala types used for their columns,
and it generates the necessary `querySchema` objects in order to map the fields.
Additionally, it generates a database-independent query schema trait which can be composed
with a `Context` object of your choice.

Here is an example of how you could use the `ComposeableTraitsJdbcCodegen` in order to replace the
`first_name` and `last_name` properties with `first` and `last`.

````scala
val gen = new ComposeableTraitsJdbcCodegen(
  configOrDataSource,
  packagePrefix = "com.my.project",
  nestedTrait = true) {

  override def nameParser: NameParser = CustomNames(
    columnParser = col => col.columnName.toLowerCase.replace("_name", "")
  )

  override def packagingStrategy: PackagingStrategy = PackagingStrategy.ByPackageHeader.TablePerSchema(packagePrefix)
}
gen.writeFiles("src/main/scala/com/my/project")
````


The following schema should be generated as a result.
````scala
package com.my.project.public

case class Person(id: Int, first: Option[String], last: Option[String], age: Int)

case class Address(person_fk: Int, street: Option[String], zip: Option[Int])

// Note that by default this is formatted as "${namespace}Extensions"
trait PublicExtensions[Idiom <: io.getquill.idiom.Idiom, Naming <: io.getquill.NamingStrategy] {
  this: io.getquill.context.Context[Idiom, Naming] =>

  object PersonDao {
    def query = quote {
      querySchema[Person](
        "public.person",
        _.id -> "id",
        _.first -> "first_name",
        _.last -> "last_name",
        _.age -> "age"
      )
    }
  }

  object AddressDao {
    def query = quote {
      querySchema[Address](
        "public.address",
        _.person_fk -> "person_fk",
        _.street -> "street",
        _.zip -> "zip"
      )
    }
  }
}
````

The generated trait can then be composed with a context of your choice:
````scala
object MyCustomContext extends SqlMirrorContext[H2Dialect, Literal](H2Dialect, Literal)
  with PublicExtensions[H2Dialect, Literal]
````

`ComposeableTraitsJdbcCodegen` is designed to be customized by overriding its strategy members. Here is a fuller list of the customizable strategies:

```scala
import io.getquill.codegen.jdbc.ComposeableTraitsJdbcCodegen
import io.getquill.codegen.model._

new ComposeableTraitsJdbcCodegen(...) {

  // whether to generate Scala code for a table
  override def filter(tc: RawSchema[JdbcTableMeta, JdbcColumnMeta]): Boolean = ???

  // how to name tables / columns in Scala
  override def nameParser: NameParser = ???

  // how to organize generated code into files / packages
  override def packagingStrategy: PackagingStrategy = ???

  // which JVM types (classes) to use for DB columns,
  // e.g. one may want to translate Postgres `timestamptz` to java.time.OffsetDateTime
  override def typer: Typer = ???

  // what to do when the `typer` above cannot find an appropriate type and returns None
  override def unrecognizedTypeStrategy: UnrecognizedTypeStrategy = ???
}
```

## Stereotyping
Frequently in corporate databases, the same kind of table is duplicated across multiple schemas, databases, etc...
CONTRIBUTING.md
Once you have resolved any conflicts that may have arisen from the rebase, your
branch will be capable of becoming a pull-request.

## Oracle Support

By default, the sbt build will not run or even compile the Oracle test suites. This is because
Oracle JDBC drivers are not available in any public repository. If you wish to test with the built-in
Oracle 18c XE Docker container using the Oracle 18c XE JDBC drivers, you can extract them from
the container and load them into your local maven repo using the `load_jdbc.sh` script.
Note that this is only allowed for development and testing purposes!

Use the `-Doracle` argument to activate compilation and testing of the Oracle test suites.

```bash
# Load oracle jdbc drivers
> ./build/oracle_test/load_jdbc.sh
...

# Specify the -Doracle argument *before* the build phases that will run Oracle tests
> sbt -Doracle clean test
```

## Building locally using Docker only for databases

To restart your database service with database ports exposed to your host machine run:
After that we need to set some environment variables in order to run `sbt` locally:
```bash
export CASSANDRA_HOST=127.0.0.1
export CASSANDRA_PORT=19042
export CASSANDRA_CONTACT_POINT_0=127.0.0.1:19042
export CASSANDRA_DC=datacenter1
export MYSQL_HOST=127.0.0.1
export MYSQL_PORT=13306
export MYSQL_PASSWORD=root
export POSTGRES_HOST=127.0.0.1
export POSTGRES_PORT=15432
export POSTGRES_PASSWORD=postgres
export SQL_SERVER_HOST=127.0.0.1
export SQL_SERVER_PORT=11433
export ORIENTDB_HOST=127.0.0.1