Phantom Tips: Tip #2: Testing with phantom-sbt
Flavian, Scala Developer
Flavian is a Scala engineer with many years of experience and the author of phantom and morpheus.

The next post in our series on phantom is an introduction to phantom-sbt, the designated testing helper of choice. With this plugin, your tests can depend on a real instance of Cassandra, embedded in a forked JVM and ready just in time for your test run. The connectors framework in phantom is natively compatible with this approach: with a very simple override, all you really need to do is change a few settings and you can have real integration tests running in no time.

1. Installing the right dependencies

First, you need to add a few things to your plugins.sbt. The following resolvers need to be present in both Ivy-style and normal Maven-style patterns. The reason is that SBT resolves plugins differently from normal artefacts, and the plugin itself needs the resolver to fetch its own non-plugin dependencies.

Without further ado:

// This will allow you to download the plugin directly from Maven Central.
// It's simply instructing SBT what path to use to retrieve it from there.
// This allows us to publish SBT plugins to central and prevent the need for custom resolvers
def outworkersPattern: Patterns = Patterns(
  Resolver.mavenStyleBasePattern :: Nil,
  Resolver.mavenStyleBasePattern :: Nil,
  isMavenCompatible = true
)

resolvers ++= Seq(
  // whatever is already in here...
  Resolver.url(
    "Maven Ivy Outworkers",
    url(Resolver.DefaultMavenRepositoryRoot)
  )(outworkersPattern)
)

// And finally the plugin dependency itself.
// Note: SBT plugins are added with a single %, not %%.
addSbtPlugin("com.outworkers" % "phantom-sbt" % "2.0.1")

Now this should resolve nicely and phantom-sbt should be available in the classpath.

2. Mixing in the right settings for phantom

There are two ways to go about this. The first: for a Build.scala setup, you will need to manually import the right settings into scope and add them. Make sure that every sub-module that requires a running Cassandra includes the following:

settings = Defaults.coreDefaultSettings ++ .. ++ PhantomSbtPlugin.projectSettings


With the right settings included, the plugin adds a new dependency to the test phase of the project: the command sbt test now depends on an internal task called startEmbeddedCassandra, defined inside the phantom SBT plugin. All the above mixin does is ensure that sbt test only runs after an embedded version of Cassandra has been spawned in memory.
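Conceptually, the plugin's settings amount to an sbt task dependency along these lines. This is only an illustrative sketch: the real startEmbeddedCassandra task and its wiring live inside phantom-sbt, so you never write this yourself.

```scala
// Sketch only: make the test task wait for embedded Cassandra to start.
// `startEmbeddedCassandra` is defined by phantom-sbt; shown here purely
// to illustrate how the dependency is expressed in sbt 0.13 syntax.
test in Test := (test in Test).dependsOn(startEmbeddedCassandra).value
```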

import com.outworkers.phantom.sbt.PhantomSbtPlugin

// then simply add the following settings to where you need Embedded Cassandra.
// You don't actually need all the dependencies we have here.
// If you're running embedded Cassandra, we are assuming it's a DB module with phantom in it.
// Just add the settings in right place.
lazy val db = Project(
    id = "db",
    base = file("db"),
    settings = Defaults.coreDefaultSettings ++
      sharedSettings ++
      PhantomSbtPlugin.projectSettings
  ).settings(
    name := "db",
    libraryDependencies ++= Seq(
      "com.outworkers" %% "phantom-dsl" % Versions.phantom
    )
  ).dependsOn(
    domain
  )


Now you are halfway there. To make the embedded approach work with your application code, you need one last step: making sure that tests run against the embedded version of the database. Guaranteeing this requires a somewhat laborious injection pattern.

The second way: if you are instead using build.sbt, you can use the auto-plugin features of phantom-sbt to automatically bring everything you need into scope. You can still follow the explicit approach above and add the settings only where you feel it appropriate, since you may well choose to separate integration tests that require embedded Cassandra from the rest of your tests. That kind of isolation requires explicit scoping of the phantom-sbt plugin settings.
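For the build.sbt route, the wiring might look like the following. This is a sketch assuming phantom-sbt exposes its settings as an sbt AutoPlugin; the module layout mirrors the Build.scala example above:

```scala
// build.sbt (sketch, assuming phantom-sbt is on the plugin classpath)
lazy val db = (project in file("db"))
  .enablePlugins(PhantomSbtPlugin)
  .settings(
    name := "db",
    libraryDependencies ++= Seq(
      "com.outworkers" %% "phantom-dsl" % Versions.phantom
    )
  )
  .dependsOn(domain)
```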

Let's assume you've created a basic database using phantom's native Database implementation. If you haven't already done this or you are not sure how, please review our introduction to phantom available here. Assuming you already have a database to work with, we can move on to the next step.

3. Using provider traits to inject database implementations

Now Scala gives us traits to play with, and the interesting one you will need today is DatabaseProvider. Its implementation is actually really simple.

Let's assume your database looks something like this:

import com.outworkers.phantom.dsl._

class MyDb(override val connector: KeySpaceDef) extends Database[MyDb](connector) {
  // a database contains all your table objects
  object firstTable extends FirstTable with connector.Connector
  object secondTable extends SecondTable with connector.Connector
  // etc, add as many as you need to here.
  // It's nice because whatever connector you feed in will be used for all tables,
  // and you can have multiple parallel databases that get created against different connectors.
}

// The matching provider trait looks like this:
trait MyDbProvider extends DatabaseProvider[MyDb] {
  def database: MyDb
}

// Now we can do something very very interesting.
// Just a fake sequence of potential IP addresses.


object AppConfig {
  val productionHosts = Seq("154.242.2.2", "..")
}

object ProductionDb extends MyDb(ContactPoints(AppConfig.productionHosts).keySpace("my_app"))

trait ProductionDatabase extends MyDbProvider {
  override def database = ProductionDb
}

// Using embedded will automatically attempt to connect to localhost:9142, the port conventionally
// used by embedded Cassandra.
object TestDb extends MyDb(ContactPoint.embedded.keySpace("my_app_test"))

trait TestDatabaseProvider extends MyDbProvider {
  override val database = TestDb
}
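To see concretely what the provider trait buys you, here is a self-contained toy model of the injection pattern. FakeDb stands in for a real phantom Database, and DatabaseProvider is re-defined locally so the sketch compiles on its own; nothing here depends on phantom itself.

```scala
// A self-contained toy model of the provider-trait injection pattern.
// FakeDb stands in for a real phantom Database; the trait names mirror
// the phantom example above, but nothing here depends on phantom.
trait DatabaseProvider[T] {
  def database: T
}

final class FakeDb(val keySpace: String)

// Binding the provider to the "test" database is a one-line mixin.
trait TestDbProvider extends DatabaseProvider[FakeDb] {
  override val database: FakeDb = new FakeDb("my_app_test")
}

// Application code depends only on the abstract `database` member,
// so swapping production for embedded never touches business logic.
class UserService extends TestDbProvider {
  def currentKeySpace: String = database.keySpace
}

object ProviderDemo extends App {
  println(new UserService().currentKeySpace) // prints my_app_test
}
```

The key property is that UserService never names a concrete database: changing which instance it talks to is just a matter of which provider trait you mix in.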


4. Creating a root test suite for all your database integration tests.

The next step is to write a root trait that your tests implement, just for convenience and simplicity. Let's assume you are using ScalaTest. Phantom doesn't mandate usage of any specific testing library, although internally we generally prefer ScalaTest and our util library is also designed to work with ScalaTest.

import scala.concurrent.Await
import scala.concurrent.duration._
import org.scalatest.{Suite, Matchers, BeforeAndAfterAll, OptionValues}
import org.scalatest.concurrent.ScalaFutures

trait DatabaseTest extends Suite
  with BeforeAndAfterAll
  with ScalaFutures
  with Matchers
  with OptionValues
  with TestDatabaseProvider {
  override def beforeAll(): Unit = {
    super.beforeAll()
    // Automatically create every single table in Cassandra.
    database.create(5.seconds)
  }
}


5. Writing tests for your tables

Now all other tests can mix in the DatabaseTest trait. The following test assumes you have a users table with `store` and `getById` methods defined.

import org.scalatest.FlatSpec
import java.util.UUID

// Let's assume the User class is: `case class User(id: UUID, name: String)`

class UserDatabaseTests extends FlatSpec with DatabaseTest {
  it should "store a user in the database and retrieve it" in {
    val user = User(UUID.randomUUID(), "test_user")

    val chain = for {
      store <- database.users.store(user)
      get <- database.users.getById(user.id)
    } yield get

    whenReady(chain) { res =>
      res shouldBe defined
      res.value shouldEqual user
    }
  }
}


And there you have it: the completed testing cycle. Repeat this pattern across all your test classes by mixing in the root testing trait DatabaseTest, and auto-magically all your tests will run against an embedded Cassandra instance that SBT powers on just in time, thanks to the phantom-sbt plugin.

We hope you had fun reading this tutorial and we look forward to your feedback! For more tutorials on phantom and all our products, click the subscribe button below; we will never send anything other than really cool tutorials!
