Author:
Tim Chandler
Published on 2021-03-09

Fast Test-Databases

This book describes techniques used to get the best speeds from test-databases.

The concepts are described with the PHP framework Laravel in mind, but the concepts aren't specific to Laravel.

Preface

Testing code is important, and it can be frustrating when tests run slowly. One of the main causes is the time spent building test-databases.

Laravel has many tools available out-of-the-box, such as the RefreshDatabase and DatabaseMigrations traits, migration squashing, and its integration with ParaTest. In fact, as a framework, Laravel makes it very easy to test your code.

I explain these as well as other concepts that can be used to speed up test-databases, such as how browser tests with databases can be run in parallel.

The examples in this book relate to Laravel, running PHPUnit based tests, and use MySQL - unless specified otherwise.


Table of Contents


Chapter 1 - Introduction

As a brief introduction: testing is a way for you to ensure your project works as expected. When you write code, write tests to go along with it. Down the track, if a change breaks a test, you'll know there's a problem and where to look to fix it.

Having good test code-coverage is important because you and others can be more confident when making changes to your codebase. This is useful when adding new features and refactoring, as well as when keeping your 3rd-party dependencies up-to-date.

There are different types of tests. Tests can be:

  • low level - called unit tests - which test small pieces of code,
  • higher level tests such as feature and integration tests - which check that different parts of your code work well together, and
  • browser tests - which interact with your code the same way a user would.

There are various other names for tests, like feature tests, functional tests, acceptance tests, confidence tests, contract tests and end-to-end tests. (Some are just different names for the same thing).

Each type of test focuses on the code differently. There are varied opinions on what each type of test should do and not do.

Test Driven Development (TDD) is a development process you may wish to use, where tests are continuously built alongside the code they test.

Jason McCreary has a useful guide on where to start when testing and Laracasts has a section of video tutorials about testing as well.

Why speed up testing?

It's important your tests don't run slowly. The more waiting you do when running tests, the less likely you are to want to write them. This creates a distraction and can become a critical barrier to building tests.

A slow test suite will also slow down deployment times when using a CI/CD pipeline.

Maybe it's not too bad. But either way, all things being equal, faster tests are better.

Non-database speed improvements

Whilst this book focuses on test-databases, it's important to note there are other ways to speed up tests. Here are a few to give you some ideas:

  • Don't use the Laravel App when you don't need to: If your tests extend from Tests\TestCase which uses the CreatesApplication trait, a new Laravel App is built for each test which takes a bit of time. This includes initialising the service-providers, detecting routes, and more.

    If you don't need Laravel's functionality in a test, extend from PHPUnit\Framework\TestCase instead.

  • Turn Xdebug off: Xdebug is a useful development tool however it will slow down your tests. If you'd like code-coverage reports, use PCov instead (support was added in PHPUnit 8).

  • Use Mocks: Mocks are a way of substituting code with pretend code. This can save time when you mock slow code, like an interface that accesses 3rd-party services.

    You should frame your decision to use mocks around how you'd like your tests to focus on your code. That is, if you mock something (or don't), what are your tests actually testing? If mocking aligns with your testing strategy, great.

  • Use a low number of password hash rounds: Password hashing is slow (by design), which adds up when your seeders create lots of hashed passwords. Lower the number of hash rounds in your test environment, or hard-code password hash values to avoid the hashing altogether (see the sketch after this list).

  • Run specific tests: When working on a particular piece of code, you can run specific tests by adding the --filter=xxx option as you run your tests. Class names and test-methods can be searched for this way.
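
As a rough illustration of the hash-rounds idea, here's a minimal sketch that lowers bcrypt's cost in the base test class. The hashing.bcrypt.rounds config key matches recent Laravel versions - adjust it to suit your setup:

<?php
// tests/TestCase.php (sketch)

namespace Tests;

use Illuminate\Foundation\Testing\TestCase as BaseTestCase;

abstract class TestCase extends BaseTestCase
{
    use CreatesApplication;

    protected function setUp(): void
    {
        parent::setUp();

        // use the minimum bcrypt cost so seeders/factories
        // that hash passwords don't dominate the test time
        config(['hashing.bcrypt.rounds' => 4]);
    }
}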

Tim MacDonald describes some of these and others in more detail.

The problem when working with test-databases

Over time, you may end up with hundreds or possibly thousands of tests, depending on the project. If it takes even a fraction of a second to prepare the database freshly for each test, that's a lot of extra time overall. Luckily there are several things that can improve this part of the process.

The problem that needs to be solved is that the database needs to be in a "clean" state for each test. Its content must be known, so the tests can be deterministic. Values that bleed between tests may make them unstable.

Aside: I've read about test environments where tests try to clean-up after themselves, or simply share a database. I would avoid this altogether as it can lead to headaches due to obscure dependencies between tests, and errors that are hard to reproduce.

Unstable tests cannot be a trade-off for increased speed.

Laravel's database-related testing features

If you want to jump straight in and test your Laravel project, here's a summary of its database-related testing features:

  • The RefreshDatabase trait: You probably already use this one. It will run your migrations (but not seeders) on the "default" connection's database before the first test, and wrap each test in a transaction that's rolled back afterwards.

    If you're using a persistent database (like MySQL), you'll need to be careful not to accidentally commit the transaction within your tests, and your code itself can't use transactions.

  • The DatabaseTransactions trait: This can be used to wrap each test in a transaction. Unlike RefreshDatabase, it doesn't build the database first.

    It can be useful when running parallel tests if you'd like to use a single database - provided you build the database yourself before-hand.

  • The DatabaseMigrations trait: This is Laravel's recommended way of building databases when browser testing. It migrates the "default" connection's database fresh for every test.

  • Run tests in parallel: Laravel 8.25 introduced support for running (non-browser) tests in parallel. It also re-uses the databases it creates between test-runs. After making changes to your database schema, you'll need to trigger their rebuild manually.

  • Seeding: (When parallel testing) you can tell Laravel to perform steps like seeding when initially setting up your database.

  • Change database type: You can specify a different type of database for your tests to use (e.g. SQLite) via your .env.testing and config/database.php files (see the sketch after this list).

  • "Squash" your migrations into a schema file: If you have lots of migration files, you may want to "squash" your migrations into a schema sql file. This file replaces your current migrations and is imported when migrating fresh, before any new migrations are run.

Read on for more detailed explanations of these, as well as others…


Chapter 2 - Don't use a database when you don't need one

Not all tests need a database, so skip the database altogether when it's not needed!

This is the easiest optimisation to make, and it might seem obvious. But it's worth mentioning so it's not overlooked.

<?php
// tests/Unit/MyTest.php

namespace Tests\Unit;

use App\Support\Calculator;
//use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class MyTest extends TestCase
{
    // use RefreshDatabase; // not needed here

    public function test_addition()
    {
        $this->assertSame(2, Calculator::add(1, 1));
    }
}

This example could probably also extend from PHPUnit\Framework\TestCase instead of Laravel's Tests\TestCase because it doesn't look like it uses any of Laravel's features.


Chapter 3 - Insert in bulk

The following chapters contain techniques that can be used when building your test-databases, populating, and using them.

To start with, an area that might give you speed improvements is how your database is populated.

When your seeders create large sets of data, it pays to have a look at how that data is inserted into the database.

Rows that are inserted individually will run slower than those combined into a single query. Combining the inserts will reduce the number of instructions sent to the database, which saves time:

# slower
INSERT INTO users (`name`, `email`) VALUES ('bob', 'bob@example.com');
INSERT INTO users (`name`, `email`) VALUES ('jane', 'jane@example.com');
…
# faster
INSERT INTO users (`name`, `email`) VALUES
    ('bob', 'bob@example.com'), ('jane', 'jane@example.com'), …;

Laravel factories can be used in this way too:

// instead of inserting rows in separate queries
User::factory()->count(1000)->create();
// insert all the rows in a single query
User::insert(
    User::factory()->count(1000)->make()->toArray()
);

When inserting models along with relationships to other models, inserts may still need to be kept separate.

There's a limit on how many models you can save in one query. It depends on how many and what type of fields they have, as well as some database run-time limitations. If you find that these large insert queries fail, you could look at chunking your data before saving:

// insert the rows in chunks
$users = User::factory()->count(100000)->make();
foreach ($users->chunk(1000) as $userChunk) {
    User::insert($userChunk->toArray());
}

// or, to save php memory
foreach (range(1, 100) as $i) {
    User::insert(
        User::factory()->count(1000)->make()->toArray()
    );
}

You can take this concept a step further by not using factories, as Povilas Korop describes. This will give you an extra speed boost, but you lose the benefits they give you.

Something that might also help improve the speed of a seeder's inserts is to temporarily disable foreign-key constraints (remember to re-enable them afterwards). This disables the extra checks MySQL performs when your tables have foreign keys:

# MySQL
SET FOREIGN_KEY_CHECKS=0;
# insert data…
SET FOREIGN_KEY_CHECKS=1;
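
If you'd rather stay in PHP than run raw SQL, Laravel's Schema facade exposes the same switch. A minimal seeder sketch (the class name and data are just for illustration):

<?php
// database/seeders/BigDataSeeder.php (sketch)

namespace Database\Seeders;

use App\Models\User;
use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\Schema;

class BigDataSeeder extends Seeder
{
    public function run(): void
    {
        // skip foreign-key checks while bulk-inserting
        Schema::disableForeignKeyConstraints();

        User::insert(
            User::factory()->count(1000)->make()->toArray()
        );

        // turn the checks back on afterwards
        Schema::enableForeignKeyConstraints();
    }
}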

Chapter 4 - Use a pre-built database

If you have lots of migrations that have built up over time, there's a good chance they update the same tables over-and-over (adding and tweaking fields etc). Running these migrations can become inefficient.

Many database servers let you create sql dump files, backups that can be imported again later. These are essentially snapshots of a database at a point in time.

MySQL's mysqldump can be used to generate these. Importing one of these dump files is often quicker than migrating your database from scratch. Sometimes by a little, sometimes by a lot. It depends on how inefficient your migrations have become.

# MySQL export
mysqldump --add-drop-table --skip-lock-tables test_database > test_database.snapshot.sql
# import
mysql test_database < test_database.snapshot.sql

If your tests use Laravel's DatabaseTransactions trait (which won't try to build the database), you could import a snapshot before running your tests:

<?php
// tests/Feature/MyTest.php

namespace Tests\Feature;

use Illuminate\Foundation\Testing\DatabaseTransactions;
use Tests\TestCase;

class MyTest extends TestCase
{
    use DatabaseTransactions;

    …
}

# import the pre-built database
mysql test_database < test_database.snapshot.sql
# run the tests
php artisan test

Note: If you add new migrations, you'll need to re-create your sql dump file. Or alternatively, you could run these new migrations after importing the sql snapshot.

# import the pre-built database and migrate
mysql test_database < test_database.snapshot.sql
php artisan migrate
# run the tests
php artisan test

Laravel incorporates this concept into its migration process, by letting you "squash" - i.e. replace existing migrations with a sql dump file. This dump will be imported when migrating, before any newer migrations are run.

php artisan schema:dump --prune

This can be even faster when using SQLite, as its databases are simply files. Instead of creating a sql dump file, they can just be copied, which is very quick! (SQLite in-memory databases can't be copied this way, however).

# SQLite export
cp -p test-database.sqlite test-database.snapshot.sqlite
# import
cp -p test-database.snapshot.sqlite test-database.sqlite

If you have seeders that take a while to run, you could consider including them in your snapshot as well. This is worth considering when you have several tests that use the same seeders.


Chapter 5 - Wrap tests in transactions

Database transactions are used to isolate changes in a database from other processes. Changes within a transaction won't appear to others until they are committed. This is normally used when several changes need to be atomic - that is, they need to appear to happen at the same time, such as when transferring money from one account to another.

If changes need to be cancelled, they can be rolled-back instead of being committed. When this happens, the changes disappear and no other processes get to see them. This becomes a really useful tool when testing.

Once you've built your database, you can start a transaction before running a test, and roll it back afterwards. This leaves the database clean - as though your test never ran.

Although there are some circumstances where this can't be used, this method gives the best gains of any method in this book.

# start a transaction
DB::beginTransaction();
# run a test
# roll-back the transaction
DB::rollBack();
# repeat for the next test

Useful to note: In MySQL, auto-increment ids aren't reset when a transaction is rolled back. They keep increasing between transactions. This isn't serious, but you should be aware, so you don't write your tests to rely on particular ids.

Laravel lets you wrap your tests in a transaction like this by adding the RefreshDatabase trait to your test classes. The database your "default" connection points to will be built (your migrations will be run, but not your seeders), and each test is then run within a transaction.

<?php
// tests/Feature/MyTest.php

namespace Tests\Feature;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class MyTest extends TestCase
{
    use RefreshDatabase;

    …
}

Adam Wathan proposed the idea behind RefreshDatabase, and describes it in this video.

Checking that transactions weren't committed

It's important to note that while transactions are very useful when used this way, they can't always be used. You need to make sure they aren't committed, as changes will be written to the database:

  • If a second transaction starts while one is already active, the first will be implicitly committed. For this reason you won't be able to use this method if the code you're testing uses its own transactions.

  • It's also possible to accidentally commit a transaction. There are several things that implicitly commit transactions, such as when you truncate or alter a MySQL table.

When using Laravel's RefreshDatabase trait, this may cause a PDOException "There is no active transaction".

If the transaction was committed, the database needs to be rebuilt to get it back to a known state. The best thing to do is to check afterwards.

It's possible to detect this by checking that a transaction is still active after the test has finished. However, it's probably safer to plant a value in a meta-data table - one that you know won't exist after the transaction is rolled back.

Do this by starting your transaction and then inserting the value. If the transaction is committed, this value will persist. You can check to make sure it doesn't exist before each test starts.

Another situation where a transaction-wrapper can't be used is browser tests. This is because two processes need to access the same data, however only one would see the data hidden inside the transaction.

In situations where tests can't be wrapped in a transaction, you could look at other techniques like importing a snapshot dump file instead.

Example Code: Detect committed transactions

Let's create our own code to illustrate this process.

We'll build a class DatabaseBuilder which manages the database set-up, and a trait PrepareDatabase that integrates it into our MyTest test.

Just like in Laravel, a useful way to inject functionality into a test is by using a trait. Our PrepareDatabase trait will bootstrap the code that manages the database:

<?php
// tests/Feature/PrepareDatabase.php

namespace Tests\Feature;

trait PrepareDatabase
{
    /**
     * @before
     */
    public function boot(): void
    {
        $this->afterApplicationCreated(
            fn() => $this->prepareConnection()
        );
    }

    private function prepareConnection(): void
    {
        $connection = config('database.default');
        $builder = new DatabaseBuilder($connection);

        $builder->start();
        $this->beforeApplicationDestroyed(
            fn() => $builder->finish()
        );
    }
}

The @before docblock annotation tells PHPUnit to run that method before each test runs.

The afterApplicationCreated() and beforeApplicationDestroyed() methods tell Laravel to run the given closures just after its App has been initialised, and just before it's destroyed, respectively.

The DatabaseBuilder class that's used above is what does the work. To start with in our example, it will:

  • build the database when the test-run starts,
  • wrap each test in a transaction, and
  • check that the transaction rolled-back successfully each time.

(Please excuse the long example code, it covers a lot of ground!)

<?php
// tests/Feature/DatabaseBuilder.php

namespace Tests\Feature;

use Artisan;
use DB;
use Illuminate\Database\ConnectionInterface;
use PDO;
use stdClass;
use Throwable;

class DatabaseBuilder
{
    private string $connection;
    private string $database;
    private ?PDO $pdo = null;

    public function __construct(string $connection)
    {
        $this->connection = $connection;
    }

    public function start(): void
    {
        $this->chooseDB();
        $this->prepareDB();
        $this->startTransaction();
    }

    public function finish(): void
    {
        $this->rollBackTransaction();
    }

    private function chooseDB(): void
    {
        $key = "database.connections.$this->connection.database";
        $this->database = config($key);
    }

    private function prepareDB(): void
    {
        if ($this->dbIsReusable()) {
            dump("REUSING the database: $this->database");
        } else {
            dump("REBUILDING the database: $this->database");
            $this->rebuildDB();
        }
    }

    private function dbIsReusable(): bool
    {
        $row = $this->loadReuseRow();
        if (!$row) {
            return false;
        }
        if ($row->txn_committed) {
            return false;
        }
        return true;
    }

    private function loadReuseRow(): ?stdClass
    {
        $row = null;
        try {
            $row = $this->directDB()->query(
                "SELECT * FROM `$this->database`.reuse_check LIMIT 0, 1",
                PDO::FETCH_CLASS,
                'stdClass'
            )->fetchObject();
        } catch (Throwable $e) {
            // the database might not exist yet
        }
        return $row ?? null;
    }

    private function rebuildDB(): void
    {
        $this->createDB();
        $this->migrate();
        $this->createMetaDataTable();
    }

    private function createDB(): void
    {
        // always start fresh
        $this->directDB()->exec(
            "DROP DATABASE IF EXISTS `$this->database`"
        );
        $this->directDB()->exec(
            "CREATE DATABASE `$this->database`"
        );
    }

    private function migrate(): void
    {
        Artisan::call('migrate:fresh');
    }

    private function createMetaDataTable(): void
    {
        $this->laravelDB()->statement(
            "CREATE TABLE reuse_check ("
            . "txn_committed TINYINT"
            . ")"
        );
        $this->laravelDB()->insert(
            "INSERT INTO reuse_check ("
            . "txn_committed"
            . ") "
            . "VALUES (0)"
        );
    }

    private function startTransaction(): void
    {
        $this->laravelDB()->beginTransaction();
        // this value will persist if committed
        $this->laravelDB()->insert(
            "UPDATE reuse_check SET txn_committed = 1"
        );
    }

    private function rollBackTransaction(): void
    {
        $this->laravelDB()->rollBack();
    }

    private function laravelDB(): ConnectionInterface
    {
        return DB::connection($this->connection);
    }

    private function directDB(): PDO
    {
        if (!is_null($this->pdo)) {
            return $this->pdo;
        }

        // connect to the database directly
        // without choosing a database yet
        $config = config(
            "database.connections.$this->connection"
        );
        $driver = $config['driver']; // note only mysql is supported here
        $dsn = sprintf(
            "$driver:host=%s;port=%d",
            $config['host'],
            $config['port']
        );

        // the connection is closed when this object is destroyed
        return $this->pdo = new PDO(
            $dsn,
            $config['username'],
            $config['password']
        );
    }
}

And MyTest is a test that needs a database:

<?php
// tests/Feature/MyTest.php

namespace Tests\Feature;

use Tests\TestCase;

class MyTest extends TestCase
{
    use PrepareDatabase;

    public function testSomething1()
    {
        $this->assertTrue(true);

        // "accidentally" commit the transaction
        // \DB::commit();
    }

    public function testSomething2()
    {
        $this->assertTrue(true);
    }
}

When the database is reused like this, the 2nd test onwards skips the building phase:

php artisan test
# when the database is REBUILT each time
"REBUILDING the database: test_database"
"REBUILDING the database: test_database"
PASS  Tests\Feature\MyTest
  something1
  something2
Tests: 2 passed
Time:  0.89s

# when the database is REUSED for the 2nd test
"REBUILDING the database: test_database"
"REUSING the database: test_database"
PASS  Tests\Feature\MyTest
  something1
  something2
Tests: 2 passed
Time:  0.51s

Chapter 6 - Re-using test-databases between test-runs

If a database is left-over from a previous test-run, it can potentially be used again straight away the next time you run your tests, in much the same way a database is reused within the same test-run.

This will save you a lot of time when you're working on a piece of code and need to run the same test/s over-and-over, as you won't have that initial building pause.

For the same reasons as using transactions, this method won't work with browser tests, or code that commits transactions.

Laravel supports database re-use via its new parallel testing feature. It automatically builds copies of the "default" connection's database, and leaves them there, ready for next time:

# run the tests, and reuse the databases if they exist
php artisan test --parallel

Note: You will need to trigger a re-build of the database manually when your migrations change.

# run the tests, and force a rebuild of the databases
php artisan test --parallel --recreate-databases

There's more about parallel testing later on in this book.

Making sure databases are safe to re-use between test-runs

Just like when wrapping tests in transactions, the database needs to be checked to ensure it's safe to re-use between test-runs.

Previous tests must have successfully run inside a transaction. This is described in more depth in the previous chapter about running tests inside transactions.

This time, we also need to be sure that the database was built the same way as it would be this time. Updating your migrations (or seeders + factories, depending on when you start your transaction) would cause this to change, for example. And when it does, the database needs to be rebuilt.

I describe a way of determining how the database is built in the next chapter. Here is how our output might look when the database can be re-used straight away:

php artisan test
# when the database is REUSED straight away
"REUSING the database: test_database"
"REUSING the database: test_database"
PASS  Tests\Feature\MyTest
  something1
  something2
Tests: 2 passed
Time:  0.11s

There are some other considerations about this approach:

  • To ensure integrity, changes can't be made to the left-over database outside test-runs,

  • You'll have an extra database sitting around taking up space.

I can't imagine either of these would be an issue.


Chapter 7 - Using multiple "scenario" test-databases

A useful technique is to create separate databases for different "scenarios". These let us isolate databases with different migrations, and seeders etc.

This involves determining what the current test's "scenario" is, and building a database with a name that's unique for it.

A simple example might have databases named something like: test_scenario_1, test_scenario_2, test_scenario_3.

In reality, it's better to create a hash based on certain criteria, and use that in the database name. By doing this, each scenario would generate a different hash, and so a different database.

I'm going to break a scenario down into two parts, each with their own hash: a "build" and a "situation"…

Determining the "build"

A "build" means any tools that could be used to build a database, regardless of whether they're actually used in a particular situation or not. These would include:

  • all possible pre-migration import files (i.e. sql dump files that can be imported before migrating),
  • all possible migrations,
  • all possible seeders,
  • all possible factories.

The "build" hash will be based on these files. If you add a migration or change a seeder, this hash would change, and your previous test-databases will be ignored.

To generate the hash, you could look at the files' modified times. However, a safer way would be to look at their paths and content.

The following code generates an md5 hash based on the files in a set of directories:

use Illuminate\Support\Collection;
use RecursiveDirectoryIterator;
use RecursiveIteratorIterator;

function determineBuildHash(): string
{
    $preMigrationDir = database_path('pre_migration_imports');
    $migrationDir = database_path('migrations');
    $seederDir = database_path('seeders');
    $factoryDir = database_path('factories');

    $fileHashes = allFiles([
        $preMigrationDir,
        $migrationDir,
        $seederDir,
        $factoryDir
    ])
        ->mapWithKeys(fn($path) => [$path => md5_file($path)])
        ->all();
    return md5(serialize($fileHashes));
}

function allFiles(array $dirs): Collection
{
    return collect($dirs)
        ->map(fn(string $dir) => filesInDir($dir))
        ->flatten(1)
        ->unique()
        ->sort();
}

function filesInDir(string $dir): Collection
{
    $dirIterator = new RecursiveDirectoryIterator($dir);
    $fileIterator = new RecursiveIteratorIterator($dirIterator);

    $files = [];
    foreach ($fileIterator as $file) {
        $files[] = is_file($file->getPathname())
            ? $file->getPathname()
            : null;
    }
    return collect($files)->filter()->sort();
}

You could use the "build" hash to write a process that detects old databases (that don't match it anymore), and remove them.

Choosing a "situation"

A "situation" is made up of the things that build this particular situation, keeping it separate from other situations. These would include:

  • the particular pre-migration import files used (i.e. if you use a sql dump file before migrating),
  • the particular migrations used (in the case that you have different sets of migrations that can be run),
  • the particular seeders used,
  • whether the tests are run in a transaction or not.

Let's say you have a test that needs to run some expensive seeders that insert a lot of data. You could include these seeders' details in your situation, and you would get a database dedicated to that situation.

You might also have some other tests that use different seeders. This would become a different situation, and populate another database to co-exist with the first.

As you run your test suite, these two scenarios won't compete to use the same actual database.

As the content of the seeders and factories etc are already included in the build hash, you can simply include the list of files used when generating a situation hash:

function determineSituationHash(): string
{
    $preMigrationImports = database_path(
        'pre_migration_imports/snapshot.sql'
    );
    $migrationsDir = database_path('migrations');
    $seeders = ['DatabaseSeeder'];
    $canReuseDatabase = true;

    return md5(serialize([
        $preMigrationImports,
        $migrationsDir,
        $seeders,
        $canReuseDatabase
    ]));
}

Using a scenario

The build and situation hashes can then be used to generate a unique database name. This name serves as a way of understanding what's in the database, which is needed when reusing databases between test-runs.

function chooseDB(): string
{
    // taking a substr of the hash helps
    // keep the length manageable,
    // while still being quite unique
    $build = substr(determineBuildHash(), 0, 12);
    $situation = substr(determineSituationHash(), 0, 12);

    return "test_{$build}_{$situation}";
}

Once you have a process to determine which database name to use, you need to tell the framework to use it.

Although Laravel's config settings are normally defined once via config files (and a .env or .env.testing file), you can actually alter config values on-the-fly.

This means that once your code chooses a name for your database, you can tell Laravel to use it.

$connection = config('database.default');
$database = chooseDB();
config(["database.connections.$connection.database" => $database]);

You could potentially repeat this process to set up multiple databases at the same time, meaning one test could access 2 or more database connections if that's what your codebase does.
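
A small sketch of that idea, assuming the app under test uses a second connection named mysql2 (the connection names and the chooseDB() helper are illustrative):

// sketch - point each connection at its own scenario database
foreach (['mysql', 'mysql2'] as $connection) {
    // make the name unique per connection so they don't collide
    $database = chooseDB() . "_$connection";
    config(["database.connections.$connection.database" => $database]);
}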

Note: Laravel's parallel testing feature creates databases for you. But if you're not using it, your code will need to create the databases before use.

Example Code: Putting scenarios into practice

Adding to our earlier example which detects if a test's wrapper-transaction has been committed… this time, we're going to check that the database is safe to re-use straight away.

First, let's update MyTest to indicate which $seeders should be run automatically, and whether it uses transactions itself via $usesOwnTxn:

<?php
// tests/Feature/MyTest.php

namespace Tests\Feature;

use Tests\TestCase;

class MyTest extends TestCase
{
    use PrepareDatabase;

    protected array $seeders = ['DatabaseSeeder'];
    protected bool $usesOwnTxn = false;

    public function testSomething1()
    {
        $this->assertTrue(true);

        // "accidentally" commit the transaction
        // \DB::commit();
    }

    public function testSomething2()
    {
        $this->assertTrue(true);
    }
}

The PrepareDatabase trait has been updated to pass these and other settings needed to determine the "scenario" to the DatabaseBuilder:

<?php
// tests/Feature/PrepareDatabase.php

namespace Tests\Feature;

trait PrepareDatabase
{
    // protected array $seeders = [];
    // protected bool $usesOwnTxn = false;

    /**
     * @before
     */
    public function boot(): void
    {
        $this->afterApplicationCreated(
            fn() => $this->prepareConnection()
        );
    }

    private function prepareConnection(): void
    {
        $connection = config('database.default');
        $builder = $this->newBuilder($connection);

        $builder->start();
        $this->beforeApplicationDestroyed(
            fn() => $builder->finish()
        );
    }

    private function newBuilder(string $connection): DatabaseBuilder
    {
        return new DatabaseBuilder(
            database_path('migrations'),
            database_path('seeders'),
            database_path('factories'),
            $connection,
            $this->getProperty('seeders', []),
            $this->getProperty('usesOwnTxn', false)
        );
    }

    private function getProperty(string $name, $default = null)
    {
        return property_exists(static::class, $name)
            ? $this->$name
            : $default;
    }
}

This version of the DatabaseBuilder generates "build" and "situation" hashes, and uses them in the database name. (It also stores them in the meta-data table as an extra safety check).

Now that we're expecting some tests to use their own transactions, it also stores a new reuse_allowed flag in the meta-data table - which is turned off for tests that do.

<?php
// tests/Feature/DatabaseBuilder.php

namespace Tests\Feature;

use Artisan;
use DB;
use Illuminate\Database\ConnectionInterface;
use Illuminate\Support\Collection;
use PDO;
use RecursiveDirectoryIterator;
use RecursiveIteratorIterator;
use stdClass;
use Throwable;

class DatabaseBuilder
{
    private string $migrationsPath;
    private string $seedersPath;
    private string $factoriesPath;
    private static ?string $buildHash = null;

    private string $connection;
    private array $seeders;
    private bool $canReuseDatabase;

    private string $database;
    private ?PDO $pdo = null;

    public function __construct(
        string $migrationsPath,
        string $seedersPath,
        string $factoriesPath,
        string $connection,
        array $seeders,
        bool $testUsesOwnTxn
    ) {
        $this->migrationsPath = $migrationsPath;
        $this->seedersPath = $seedersPath;
        $this->factoriesPath = $factoriesPath;
        $this->connection = $connection;
        $this->seeders = $seeders;
        $this->canReuseDatabase = !$testUsesOwnTxn;
    }

    public function start(): void
    {
        $this->chooseDB();
        $this->prepareDB();
        $this->startTransaction();
    }

    public function finish(): void
    {
        $this->rollBackTransaction();
    }

    private function chooseDB(): void
    {
        $key = "database.connections.$this->connection.database";

        // choose a new name
        $this->database = collect([
            config($key), // original db name
            mb_substr($this->buildHash(), 0, 12),
            mb_substr($this->situationHash(), 0, 12),
        ])->filter()->implode('_');

        // use this database
        config([$key => $this->database]);
    }

    private function prepareDB(): void
    {
        if ($this->dbIsReusable()) {
            dump("REUSING the database: $this->database");
        } else {
            dump("REBUILDING the database: $this->database");
            $this->rebuildDB();
        }
    }

    private function dbIsReusable(): bool
    {
        if (!$this->canReuseDatabase) {
            return false;
        }

        $row = $this->loadReuseRow();
        if (!$row) {
            return false;
        }
        if (!$row->reuse_allowed) {
            return false;
        }
        if ($row->txn_committed) {
            return false;
        }
        if ($row->build_hash != $this->buildHash()) {
            return false;
        }
        if ($row->situation_hash != $this->situationHash()) {
            return false;
        }
        return true;
    }

    private function loadReuseRow(): ?stdClass
    {
        $row = null;
        try {
            $row = $this->directDB()->query(
                "SELECT * FROM `$this->database`.reuse_check LIMIT 0, 1",
                PDO::FETCH_CLASS,
                'stdClass'
            )->fetchObject();
        } catch (Throwable $e) {
            // the database might not exist yet
        }
        return $row ?? null;
    }

    private function rebuildDB(): void
    {
        $this->createDB();
        $this->migrate();
        $this->seedCustom();
        $this->createMetaDataTable();
    }

    private function createDB(): void
    {
        // always start fresh
        $this->directDB()->exec(
            "DROP DATABASE IF EXISTS `$this->database`"
        );
        $this->directDB()->exec(
            "CREATE DATABASE `$this->database`"
        );
    }

    private function migrate(): void
    {
        Artisan::call('migrate:fresh');
    }

    private function seedCustom(): void
    {
        collect($this->seeders)->each(
            fn (string $seeder) => Artisan::call('db:seed', ['--class' => $seeder])
        );
    }

    private function createMetaDataTable(): void
    {
        $this->laravelDB()->statement(
            "CREATE TABLE reuse_check ("
            . "reuse_allowed TINYINT, "
            . "txn_committed TINYINT, "
            . "build_hash CHAR(32), "
            . "situation_hash CHAR(32)"
            . ")"
        );
        $this->laravelDB()->insert(
            "INSERT INTO reuse_check ("
            . "reuse_allowed, "
            . "txn_committed, "
            . "build_hash, "
            . "situation_hash"
            . ") "
            . "VALUES (:reuseAllowed, 0, :buildHash, :scenarioHash)",
            [
                'reuseAllowed' => (int) $this->canReuseDatabase,
                'buildHash' => $this->buildHash(),
                'scenarioHash' => $this->situationHash(),
            ]
        );
    }

    private function startTransaction(): void
    {
        if (!$this->canReuseDatabase) {
            return;
        }

        $this->laravelDB()->beginTransaction();
        // this value will persist if committed
        $this->laravelDB()->insert(
            "UPDATE reuse_check SET txn_committed = 1"
        );
    }

    private function rollBackTransaction(): void
    {
        if (!$this->canReuseDatabase) {
            return;
        }

        $this->laravelDB()->rollBack();
    }

    private function situationHash(): string
    {
        return md5(serialize([$this->seeders, $this->canReuseDatabase]));
    }

    private function buildHash(): string
    {
        // internally cache this value to avoid
        // looking in the filesystem each time
        return static::$buildHash ??= $this->determineBuildHash();
    }

    private function determineBuildHash(): string
    {
        $fileHashes = $this
            ->allFiles([
                $this->migrationsPath,
                $this->seedersPath,
                $this->factoriesPath
            ])
            ->mapWithKeys(fn($path) => [$path => md5_file($path)])
            ->all();

        return md5(serialize($fileHashes));
    }

    private function allFiles(array $dirs): Collection
    {
        return collect($dirs)
            ->map(fn(string $dir) => $this->filesInDir($dir))
            ->flatten(1)
            ->unique()
            ->sort();
    }

    private function filesInDir(string $dir): Collection
    {
        $dirIterator = new RecursiveDirectoryIterator($dir);
        $fileIterator = new RecursiveIteratorIterator($dirIterator);

        $files = [];
        foreach ($fileIterator as $file) {
            $files[] = is_file($file->getPathname())
                ? $file->getPathname()
                : null;
        }
        return collect($files)->filter()->sort();
    }

    private function laravelDB(): ConnectionInterface
    {
        return DB::connection($this->connection);
    }

    private function directDB(): PDO
    {
        if (!is_null($this->pdo)) {
            return $this->pdo;
        }

        // connect to the database directly
        // without choosing a database yet
        $config = config(
            "database.connections.$this->connection"
        );
        $driver = $config['driver']; // note only mysql is supported here
        $dsn = sprintf(
            "$driver:host=%s;port=%d",
            $config['host'],
            $config['port']
        );

        // the connection is closed when this object is destroyed
        return $this->pdo = new PDO(
            $dsn,
            $config['username'],
            $config['password']
        );
    }
}

This will generate databases like the following for your tests:

test_database_347567bf82c6_54131f30b4ff
test_database_347567bf82c6_7871bdb1f3b5

Chapter 8 - Run your tests in parallel

Normally when your tests run, they run sequentially. You can get a speed boost by splitting the test suite up and running several at the same time.

ParaTest is a package that does this for you. It splits the workload by running tests in different processes, and shows the collated results afterwards.

# install ParaTest
composer require --dev brianium/paratest
# run your tests in parallel
./vendor/bin/paratest

When you run tests in parallel, you need to be careful of how your tests use the database because they might interfere with each other. (There are other types of things that may need looking into too).

Parallel tests, when sharing the same database between test-processes

If different test processes are going to access the same database, it needs to be built before any tests run.

Laravel's DatabaseTransactions trait hasn't been mentioned in Laravel's docs since RefreshDatabase was added in Laravel 5.5, however it still exists. And you can use it to wrap each test in a transaction.

Unlike RefreshDatabase, it doesn't build the database. So you'll need to do it yourself beforehand…

<?php
// tests/Feature/MyTest.php

namespace Tests\Feature;

use Illuminate\Foundation\Testing\DatabaseTransactions;
use Tests\TestCase;

class MyTest extends TestCase
{
    use DatabaseTransactions;

    …
}

# build the database first
php artisan migrate:fresh --env=testing --seed
# run your tests in parallel
./vendor/bin/paratest

Tim MacDonald describes this same process in parts 4, 5 & 6 of this article.

This is very effective and will speed things up a lot. But it can only be used by tests that use the same scenario, and only when your tests are successfully wrapped in a transaction. Otherwise, processes will start seeing data left behind by other tests.

Any tests that aren't safe to run this way could be isolated by putting them in a @group (in the test method's docblock) - or by grouping the safe ones instead, as in the sketch below - and tested separately.

When running your tests, the --group and --exclude-group options can be used to include or exclude them. Your unsafe tests would need to use the DatabaseMigrations trait (so the database is built fresh each time).

# build the test database
php artisan migrate:fresh --env=testing --seed
# run the tests that can safely use the same database
./vendor/bin/paratest --group=parasafe
# run the rest of the tests separately
./vendor/bin/phpunit --exclude-group=parasafe
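
For example, a test that's safe to share the database might be tagged with the parasafe group used above (a sketch):

<?php
// tests/Feature/MySafeTest.php (sketch)

namespace Tests\Feature;

use Illuminate\Foundation\Testing\DatabaseTransactions;
use Tests\TestCase;

class MySafeTest extends TestCase
{
    use DatabaseTransactions;

    /**
     * @group parasafe
     */
    public function test_something()
    {
        $this->assertTrue(true);
    }
}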

Parallel tests when using a separate database for each test-process

ParaTest gives us another option which allows us to use a separate database for each process. It sets an environment variable called TEST_TOKEN, which contains a unique value per-process.

You can check this value when your tests run using getenv('TEST_TOKEN'), and include it in your scenario's "situation". Or simply append it to the database name.
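
A minimal sketch of appending it to the database name (the example code later in this chapter does this inside chooseDB()):

$connection = config('database.default');
$key = "database.connections.$connection.database";

// e.g. "test_database_1", "test_database_2", …
$token = (string) getenv('TEST_TOKEN');
$database = $token !== '' ? config($key) . "_$token" : config($key);

config([$key => $database]);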

A tricky problem that separate databases also helps avoid is deadlocks.

As I've mentioned a number of times so far, Laravel 8.25 added ParaTest support and can now run your tests in parallel.

It picks a separate database name for each process by appending the TEST_TOKEN, and also creates it for you.

# run your tests in parallel
php artisan test --parallel

Example Code: Build a database for each process when parallel testing

We can update our example by getting the DatabaseBuilder to also add the TEST_TOKEN to make the database names unique:

<?php
// tests/Feature/DatabaseBuilder.php

class DatabaseBuilder
{
    …

    private function chooseDB(): void
    {
        $key = "database.connections.$this->connection.database";

        // choose a new name
        $this->database = collect([
            config($key), // original db name
            mb_substr($this->buildHash(), 0, 12),
            mb_substr($this->situationHash(), 0, 12),
            (string) getenv('TEST_TOKEN'),
        ])->filter()->implode('_');

        // use this database
        config([$key => $this->database]);
    }

    …
}

This will generate databases like the following for your tests:

test_database_347567bf82c6_54131f30b4ff_1
test_database_347567bf82c6_54131f30b4ff_2
test_database_347567bf82c6_7871bdb1f3b5_1
test_database_347567bf82c6_7871bdb1f3b5_2

Chapter 9 - Browser tests

Browser tests (like when using Laravel Dusk) use a real web-client to interact with your website. This involves instructing a browser (often headless) to load pages from your site. They're great for end-to-end type testing that makes sure things work as the user would see them.

Your website can be tested at different window sizes, including the dimensions of phones and tablets. You can even take screenshots.

A different approach needs to be taken when building databases for these tests. This is because the data needs to persist in the database for each test to run, and a transaction won't let us do this.

The external http request/s from the browser will be handled by a different PHP process, distinct from the process running your test. The test and this process both need access to the same database and data.

For this reason, you should not try to wrap these tests in a transaction. And because of this, changes will be saved, leaving the database in an unclean state.

The database will need to be rebuilt for each test.

Because of this overhead, you might like to look at incorporating several assertions together into the same test. For example, testing a set of related steps that a user might go through, instead of creating a test for each step.
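
For instance, one Dusk test might walk through a registration flow and make its assertions along the way, rather than rebuilding the database for each individual step. A rough sketch - the routes, field names and page text are assumptions:

<?php
// tests/Browser/RegistrationTest.php (sketch)

namespace Tests\Browser;

use Illuminate\Foundation\Testing\DatabaseMigrations;
use Laravel\Dusk\Browser;
use Tests\DuskTestCase;

class RegistrationTest extends DuskTestCase
{
    use DatabaseMigrations;

    public function test_a_user_can_register_and_log_in()
    {
        $this->browse(function (Browser $browser) {
            // several related steps share one database build
            $browser->visit('/register')
                ->type('name', 'Bob')
                ->type('email', 'bob@example.com')
                ->type('password', 'secret-password')
                ->type('password_confirmation', 'secret-password')
                ->press('Register')
                ->assertPathIs('/home')
                ->assertSee('Bob');
        });
    }
}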

Laravel gives you the DatabaseMigrations trait which builds the "default" connection's database fresh for every test. It empties the database and runs your migrations.

<?php
// tests/Browser/MyTest.php

namespace Tests\Browser;

use Illuminate\Foundation\Testing\DatabaseMigrations;
use Tests\DuskTestCase;

class MyTest extends DuskTestCase
{
    use DatabaseMigrations;

    …
}

Laravel's dusk artisan command sets up the testing environment using your .env.dusk.local env file, and runs your dusk browser tests.

# run the browser tests
php artisan dusk

There's no getting around the fact that the database needs to be rebuilt for each browser test, but I describe a way to get more out of them in the next chapter.


Chapter 10 - Parallel browser tests

Browser tests sometimes take the largest portion of time in a test suite. It's hard to speed them up as you can't use some of the good methods that you'd use otherwise. Transactions and re-using databases are out.

Being able to run them in parallel would be the best way to get the most speed out of them.

As mentioned above, browser test data needs to persist for the test to run. The test and the process handling the browser request both need access to the same database.

There's also a disconnect between the testing code, and the code handling the browser requests…

In Laravel when using Dusk, there's a balancing act where Dusk's .env.dusk.local file - which contains settings used when responding to browser requests - is copied over the regular .env file. The regular .env file is copied back afterward, but Dusk takes over the environment for the duration of the tests.

This situation doesn't allow the process responding to the browser request to know anything special about how the test-database/s were set up. So the usual way to get these to line up is to use a single database, and run the tests one at a time.

This would be a lot easier if there was a way to let the script answering the browser request know which database/s to use. In fact, it would be quite useful if it could be told about all of the configuration the test used - letting you ensure the two environments are one and the same.

This chapter focuses on this problem: how to get these different parts of this process to use the same environment.

Sharing config settings doesn't strictly relate to sharing database details. It will also be useful when browser testing a website that doesn't use a database.

There may be several creative ways to solve this problem. Here's one that I came up with for my package.

Sharing configuration when browser testing

There is a way of injecting information into the process of making test-requests to your website. When you create the browser instance that will make the requests, your test can set cookies in it.

Step 1 - Store a copy of the config, and pass its details in a cookie

Cookies can be used to pass information from your test to the process that handles browser requests.

Cookies are limited to a few thousand bytes, depending on the browser. That's not big enough to pass the environment's full config settings through, but you could store a snapshot of the config in the filesystem, and pass a reference to it in the cookie instead.

The Chrome browser (used by Dusk) will stop you from setting cookies until it's loaded a page from the site you're setting cookies for. So you'll need to make an initial request, perhaps to an empty page on your site first.

After saving the file and setting the cookie, you'll need to pick it up on the other side, when the app boots up to handle the browser's request…

Step 2 - Use the config on page-load

In Laravel, you can add Middleware which wraps itself around the website's request/response process. Middleware can choose to do something before the request is handled, or after. Or both.

Adding some middleware would let you look for this cookie at the beginning of the request process, and load the config settings it refers to.

Note: You will need to be very careful that this middleware only runs in local and testing environments. It would be unsafe to have something like this in a production website.

By default, Laravel encrypts its cookies, so you may need to encrypt your cookie as well. Or, you could set the cookie in plain text and have your middleware read it before Laravel's EncryptCookies middleware does.

You'll need to be careful of website code that removes your cookie. To combat this, you could re-set the cookie in the response on the way out of the middleware.

Step 3 - Clean-up

Afterwards, you should remove the temporary config files you've created.

You could do this by registering a callback via Laravel's beforeApplicationDestroyed(), which will then run after each test.

Another place you could do this is during the test's clean-up. PHPUnit will run a test class' tearDown() method (if it exists), after the test has run.

And lastly, you can create a method and add an @after docblock annotation. PHPUnit looks for these as well.
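
For example, an @after clean-up method might look something like this (it assumes a property such as the $sharedConfigFiles array used in the example code below):

/**
 * @after
 */
public function removeTempConfigFiles(): void
{
    // delete any temporary config snapshots this test created
    foreach ($this->sharedConfigFiles as $path) {
        @unlink($path);
    }
    $this->sharedConfigFiles = [];
}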

Note: It's important to make sure these temporary config files aren't committed to your repository. If a test stops prematurely, your clean-up won't remove them.

If you're using git, you can tell it to ignore them using a .gitignore file. On top of this, you could think about writing a process to remove them later.

Example Code: Passing Laravel's config via the browser

Here we have a new MyDuskTest test that builds a $browser object, and calls a new shareConfig($browser) method (that PrepareDatabase makes available):

<?php
// tests/Browser/MyDuskTest.php

namespace Tests\Browser;

use App\Models\User;
use Laravel\Dusk\Browser;
use Tests\DuskTestCase;
use Tests\Feature\PrepareDatabase;

class MyDuskTest extends DuskTestCase
{
    use PrepareDatabase;

    protected array $seeders = ['DatabaseSeeder'];

    public function testDusk1()
    {
        $this->browse(function (Browser $browser) {

            $this->shareConfig($browser);

            $user = User::factory()->create();

            $browser->visit('/')->assertSee("$user->id $user->name");
        });
    }
}

Let's add this new shareConfig() method to our PrepareDatabase trait. It will write the current config settings to file, and set a cookie in the browser containing its location.

It also has a new method removeConfigFiles() that's automatically called after the test has run:

<?php
// tests/Feature/PrepareDatabase.php

namespace Tests\Feature;

use Config;
use Laravel\Dusk\Browser;
use Tests\DuskTestCase;

trait PrepareDatabase
{
    private array $sharedConfigFiles = [];

    // protected array $seeders = [];
    // protected bool $usesOwnTxn = false;

    /**
     * @before
     */
    public function boot(): void
    {
        $this->afterApplicationCreated(
            fn() => $this->prepareConnection()
        );
    }

    private function prepareConnection(): void
    {
        $connection = config('database.default');
        $builder = $this->newBuilder($connection);

        $builder->start();
        $this->beforeApplicationDestroyed(function () use ($builder) {
            $builder->finish();
            $this->removeConfigFiles();
        });
    }

    private function newBuilder(string $connection): DatabaseBuilder
    {
        return new DatabaseBuilder(
            database_path('migrations'),
            database_path('seeders'),
            database_path('factories'),
            $connection,
            $this->getProperty('seeders', []),
            $this->getProperty('usesOwnTxn', false),
            $this->isDuskTest()
        );
    }

    private function getProperty(string $name, $default = null)
    {
        return property_exists(static::class, $name)
            ? $this->$name
            : $default;
    }

    private function isDuskTest(): bool
    {
        return $this instanceof DuskTestCase;
    }

    protected function shareConfig(Browser $browser): void
    {
        // store the current config in the filesystem
        $tempConfigFile = 'temp-config-' . mt_rand() . '.php';
        $content = '<?php' . PHP_EOL
            . 'return ' . var_export(Config::all(), true) . ';' . PHP_EOL;
        file_put_contents($tempConfigFile, $content);

        $tempConfigFile = realpath($tempConfigFile);
        $this->sharedConfigFiles[] = $tempConfigFile;

        // set the config cookie
        $browser->visit('/empty-page');
        $browser->addCookie(
            'config_path',
            base64_encode($tempConfigFile),
            $expiry = null,
            $options = [],
            $encrypt = false
        );
    }

    private function removeConfigFiles(): void
    {
        foreach ($this->sharedConfigFiles as $path) {
            unlink($path);
        }
    }
}

Here's the full DatabaseBuilder class for completeness. The only addition is the $isDuskTest parameter passed to the constructor:

<?php
// tests/Feature/DatabaseBuilder.php

namespace Tests\Feature;

use Artisan;
use DB;
use Illuminate\Database\ConnectionInterface;
use Illuminate\Support\Collection;
use PDO;
use RecursiveDirectoryIterator;
use RecursiveIteratorIterator;
use stdClass;
use Throwable;

class DatabaseBuilder
{
    private string $migrationsPath;
    private string $seedersPath;
    private string $factoriesPath;
    private static ?string $buildHash = null;

    private string $connection;
    private array $seeders;
    private bool $canReuseDatabase;

    private string $database;
    private ?PDO $pdo = null;

    public function __construct(
        string $migrationsPath,
        string $seedersPath,
        string $factoriesPath,
        string $connection,
        array $seeders,
        bool $testUsesOwnTxn,
        bool $isDuskTest
    ) {
        $this->migrationsPath = $migrationsPath;
        $this->seedersPath = $seedersPath;
        $this->factoriesPath = $factoriesPath;
        $this->connection = $connection;
        $this->seeders = $seeders;
        $this->canReuseDatabase = !$testUsesOwnTxn && !$isDuskTest;
    }

    public function start(): void
    {
        $this->chooseDB();
        $this->prepareDB();
        $this->startTransaction();
    }

    public function finish(): void
    {
        $this->rollBackTransaction();
    }

    private function chooseDB(): void
    {
        $key = "database.connections.$this->connection.database";

        // choose a new name
        $this->database = collect([
            config($key), // original db name
            mb_substr($this->buildHash(), 0, 12),
            mb_substr($this->situationHash(), 0, 12),
            (string) getenv('TEST_TOKEN'),
        ])->filter()->implode('_');

        // use this database
        config([$key => $this->database]);
    }

    private function prepareDB(): void
    {
        if ($this->dbIsReusable()) {
            dump("REUSING the database: $this->database");
        } else {
            dump("REBUILDING the database: $this->database");
            $this->rebuildDB();
        }
    }

    private function dbIsReusable(): bool
    {
        if (!$this->canReuseDatabase) {
            return false;
        }

        $row = $this->loadReuseRow();
        if (!$row) {
            return false;
        }
        if (!$row->reuse_allowed) {
            return false;
        }
        if ($row->txn_committed) {
            return false;
        }
        if ($row->build_hash != $this->buildHash()) {
            return false;
        }
        if ($row->situation_hash != $this->situationHash()) {
            return false;
        }
        return true;
    }

    private function loadReuseRow(): ?stdClass
    {
        $row = null;
        try {
            $row = $this->directDB()->query(
                "SELECT * FROM `$this->database`.reuse_check LIMIT 0, 1",
                PDO::FETCH_CLASS,
                'stdClass'
            )->fetchObject();
        } catch (Throwable $e) {
            // the database might not exist yet
        }
        return $row ?? null;
    }

    private function rebuildDB(): void
    {
        $this->createDB();
        $this->migrate();
        $this->seedCustom();
        $this->createMetaDataTable();
    }

    private function createDB(): void
    {
        // always start fresh
        $this->directDB()->exec(
            "DROP DATABASE IF EXISTS `$this->database`"
        );
        $this->directDB()->exec(
            "CREATE DATABASE `$this->database`"
        );
    }

    private function migrate(): void
    {
        Artisan::call('migrate:fresh');
    }

    private function seedCustom(): void
    {
        collect($this->seeders)->each(
            fn (string $seeder) => Artisan::call('db:seed', ['--class' => $seeder])
        );
    }

    private function createMetaDataTable(): void
    {
        $this->laravelDB()->statement(
            "CREATE TABLE reuse_check ("
            . "reuse_allowed TINYINT, "
            . "txn_committed TINYINT, "
            . "build_hash CHAR(32), "
            . "situation_hash CHAR(32)"
            . ")"
        );
        $this->laravelDB()->insert(
            "INSERT INTO reuse_check ("
            . "reuse_allowed, "
            . "txn_committed, "
            . "build_hash, "
            . "situation_hash"
            . ") "
            . "VALUES (:reuseAllowed, 0, :buildHash, :scenarioHash)",
            [
                'reuseAllowed' => (int) $this->canReuseDatabase,
                'buildHash' => $this->buildHash(),
                'scenarioHash' => $this->situationHash(),
            ]
        );
    }

    private function startTransaction(): void
    {
        if (!$this->canReuseDatabase) {
            return;
        }

        $this->laravelDB()->beginTransaction();
        // this value will persist if committed
        $this->laravelDB()->insert(
            "UPDATE reuse_check SET txn_committed = 1"
        );
    }

    private function rollBackTransaction(): void
    {
        if (!$this->canReuseDatabase) {
            return;
        }

        $this->laravelDB()->rollBack();
    }

    private function situationHash(): string
    {
        return md5(serialize([$this->seeders, $this->canReuseDatabase]));
    }

    private function buildHash(): string
    {
        // internally cache this value to avoid
        // looking in the filesystem each time
        return static::$buildHash ??= $this->determineBuildHash();
    }

    private function determineBuildHash(): string
    {
        $fileHashes = $this
            ->allFiles([
                $this->migrationsPath,
                $this->seedersPath,
                $this->factoriesPath
            ])
            ->mapWithKeys(fn($path) => [$path => md5_file($path)])
            ->all();

        return md5(serialize($fileHashes));
    }

    private function allFiles(array $dirs): Collection
    {
        return collect($dirs)
            ->map(fn(string $dir) => $this->filesInDir($dir))
            ->flatten(1)
            ->unique()
            ->sort();
    }

    private function filesInDir(string $dir): Collection
    {
        $dirIterator = new RecursiveDirectoryIterator($dir);
        $fileIterator = new RecursiveIteratorIterator($dirIterator);

        $files = [];
        foreach ($fileIterator as $file) {
            $files[] = is_file($file->getPathname())
                ? $file->getPathname()
                : null;
        }
        return collect($files)->filter()->sort();
    }

    private function laravelDB(): ConnectionInterface
    {
        return DB::connection($this->connection);
    }

    private function directDB(): PDO
    {
        if (!is_null($this->pdo)) {
            return $this->pdo;
        }

        // connect to the database directly
        // without choosing a database yet
        $config = config(
            "database.connections.$this->connection"
        );
        $driver = $config['driver']; // note only mysql is supported here
        $dsn = sprintf(
            "$driver:host=%s;port=%d",
            $config['host'],
            $config['port']
        );

        // the connection is closed when this object is destroyed
        return $this->pdo = new PDO(
            $dsn,
            $config['username'],
            $config['password']
        );
    }
}

This is the BrowserTestMiddleware that looks for the cookie, and loads the corresponding config:

<?php
// app/Http/Middleware/BrowserTestMiddleware.php

namespace App\Http\Middleware;

use Closure;
use Config;
use Illuminate\Http\Request;
use Illuminate\Http\Response;

class BrowserTestMiddleware
{

    public function handle(Request $request, Closure $next)
    {
        // safety check - we definitely don't
        // want this to run in production
        if (!app()->environment('local', 'testing')) {
            return $next($request);
        }

        // check for the cookie - continue if not present
        $configPath = $this->detectConfigPath($request);
        if (!$configPath) {
            return $next($request);
        }

        // use the config and re-set the cookie afterwards
        $this->importConfig($configPath);
        $response = $next($request);
        $this->reSetCookie($response, $configPath);
        return $response;
    }

    public function detectConfigPath(Request $request): ?string
    {
        return base64_decode(
            (string) $request->cookie('config_path'),
            true
        ) ?: null;
    }

    public function importConfig($configPath): void
    {
        $this->resetConfig();
        Config::set(require $configPath);
    }

    public function resetConfig(): void
    {
        foreach (array_keys(Config::all()) as $index) {
            Config::offsetUnset($index);
        }
    }

    public function reSetCookie($response, string $configPath): void
    {
        if ($response instanceof Response) {
            // re-encode the path so the cookie keeps the same
            // format it was originally set with
            $response->cookie(
                'config_path',
                base64_encode($configPath),
                null,  // minutes
                '/',   // path
                null,  // domain
                false, // secure
                false  // httpOnly
            );
        }
    }
}

This middleware needs to be added to Laravel's middleware-groups. This BrowserTestServiceProvider example inserts the middleware before any others - including Laravel's EncryptCookies.

A route is also added for your browser to load so it can set its cookie.

<?php
// app/Providers/BrowserTestServiceProvider.php

namespace App\Providers;

use App\Http\Middleware\BrowserTestMiddleware;
use Illuminate\Contracts\Http\Kernel as HttpKernel;
use Illuminate\Routing\Router;
use Illuminate\Support\ServiceProvider;

class BrowserTestServiceProvider extends ServiceProvider
{
    public function boot(Router $router): void
    {
        $this->initialiseMiddleware();
        $this->initialiseRoutes($router);
    }

    /**
     * Initialise the middleware.
     *
     * @return void
     */
    protected function initialiseMiddleware(): void
    {
        if ($this->app->runningInConsole()) {
            return;
        }
        if (!$this->app->environment('local', 'testing')) {
            return;
        }
        
        /** @var HttpKernel $httpKernel */
        $httpKernel = $this->app->make(HttpKernel::class);
        $httpKernel->prependMiddleware(BrowserTestMiddleware::class);
    }

    /**
     * Initialise the routes.
     *
     * @param Router $router Laravel's router.
     * @return void
     */
    protected function initialiseRoutes(Router $router): void
    {
        if ($this->app->runningInConsole()) {
            return;
        }
        if (!$this->app->environment('local', 'testing')) {
            return;
        }

        // The path that browsers connect to initially (when
        // browser testing) so that cookies can then be set (the
        // browser will reject new cookies before it's loaded a
        // webpage). This route bypasses the route middleware-groups.
        $router->get('empty-page', fn() => '');
    }
}

You can register this service-provider in your config/app.php file:

<?php
// config/app.php

    'providers' => [
        …
        App\Providers\BrowserTestServiceProvider::class,
    ],

And lastly, let's add the route that lists the users in the database. This is the page that the test accesses:

<?php
// routes/web.php

use App\Models\User;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::get('/', function (Request $request) {
    return User::get()
        ->map(fn($user) => "$user->id $user->name")
        ->implode('<br />' . PHP_EOL);
});

Finally

By incorporating config-sharing like this, along with scenarios and parallel-testing, you'll be able to run your browser tests in parallel!

With this config-sharing in place, you don't need to run php artisan dusk to run your Dusk tests. Just include them in your normal test-run. You can remove your .env.dusk.local file altogether.

You can run them sequentially by specifying the top directory that contains all of your tests:

# tell phpunit where to look for your tests ("tests" directory)
php artisan test tests

Or alternatively, and more usefully, you could add the tests/Browser directory as a PHPUnit test suite, so it's included when you run your tests:

<!-- phpunit.xml -->
<testsuites>
    <testsuite name="Browser">
        <directory suffix="Test.php">./tests/Browser</directory>
    </testsuite>
</testsuites>

# run the tests like normal
php artisan test
# or
php artisan test --parallel

Chapter 11 - Use a faster database engine - SQLite

Now for database related things you can do outside of your tests. The following chapters discuss things related to the type of database you're using.

SQLite is a file-based database engine that's designed to be small and fast. It's often quicker than the client-server databases websites normally use (such as MySQL and PostgreSQL).

Laravel lets you connect to a SQLite database by choosing the 'sqlite' connection in your config, and specifying the database filename.

# .env.testing
DB_CONNECTION=sqlite
DB_DATABASE=path/to/test_database.sqlite

When creating a new SQLite database, you'll need to start with an empty file. You can simply touch a new file before connecting to it.
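
For example, using the path from the .env.testing example above:

# create the empty database file before running your tests
touch path/to/test_database.sqlite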

Note: Something you need to be careful about is that SQLite doesn't support all the features of other databases (its ALTER TABLE support and column types behave differently, for example). If you're using it as a stand-in for another type of database, some things won't work. This might not be an issue for you - it depends on how your application uses the database.

To be safe, you should consider running your tests with the same type of database that you use in production. Your confidence in the tests is very important.


Chapter 12 - Testing databases in memory

Accessing data in memory is a lot faster than accessing it on a disk-based filesystem. SSDs narrow the gap, but memory is still faster. If you can store your database in memory, you might get a boost in speed…

Use a SQLite memory database

SQLite has an option that lets you create databases in memory instead of in file storage. These run quicker as they don't access the filesystem.

To use a SQLite memory database, just specify the database-name :memory: instead of a filename, and a database will be created in memory for you.

# .env.testing
DB_CONNECTION=sqlite
DB_DATABASE=:memory:

There's a limitation however that makes this option less attractive.

The drawback with SQLite memory databases is that they only exist while the connection that created them stays open. Each connection gets its own separate memory database.

Laravel builds its environment from scratch for each test, causing a new connection to the database to be made each time. Because of this reconnection, every test gets its own new (and empty) memory database. The migrations and seeders need to be rebuilt for every test, which often makes it less efficient than a persistent database that can be reused.
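
For example, with a :memory: database, a trait like Laravel's DatabaseMigrations ends up running your full set of migrations before every single test. A minimal sketch (the class and file name are just for illustration):

<?php
// tests/Feature/ExampleTest.php

namespace Tests\Feature;

use Illuminate\Foundation\Testing\DatabaseMigrations;
use Tests\TestCase;

class ExampleTest extends TestCase
{
    // runs the migrations before each test - with a :memory:
    // database this work can't be carried over, because every
    // test starts with a brand-new, empty database
    use DatabaseMigrations;

    public function test_something(): void
    {
        // ...
    }
}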

Along the same lines, this also means that you can't use them for browser testing (where two processes need to access the database at the same time).

Run your database from a memory filesystem

You could approach this from a different angle and run your regular database on a memory filesystem. This involves getting your database (like MySQL, PostgreSQL etc) to store its data in a memory partition.

Files stored in a memory filesystem will exist until the server is rebooted, so your databases will stick around between tests.

Linux has several memory-based filesystems that you could look at using. macOS and Windows have options available as well.

As an example, tmpfs can be used when running your MySQL tests in a Docker environment.

# docker-compose.yml
version: '3'
services:
  mysql:
    image: mysql:8.0
    tmpfs:
      - /var/lib/mysql
    environment:
      # your usual MySQL variables (e.g. MYSQL_ROOT_PASSWORD) go here

I've seen this method give a 6x improvement in database build time(!).
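
If you're not using Docker, a rough equivalent on a Linux host is to mount a tmpfs over MySQL's data directory yourself. This is only a sketch - the service name, mount size and paths are assumptions about your particular setup:

# stop MySQL before replacing its data directory
sudo systemctl stop mysql
# mount a RAM-backed tmpfs over the data directory
sudo mount -t tmpfs -o size=1G tmpfs /var/lib/mysql
sudo chown mysql:mysql /var/lib/mysql
# re-initialise the (now empty) data directory and start MySQL again
sudo mysqld --initialize-insecure --user=mysql
sudo systemctl start mysql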

This is a great way to get extra speed while using the same type of database that you use in production. I recommend this as the best approach when using databases in memory.


Chapter 13 - Simulating your database

There's a concept that's fairly new (in the scheme of things), that involves swapping out your database engine for a library in your codebase that does the same thing. For example, Vimeo have released an open-source package to simulate MySQL in PHP.

This may only be suitable for some projects, but it's interesting to be aware of.

Whatever the case, it falls directly in the same category as swapping to faster database engines. It may be faster, but be aware that it's not a true replacement for your database. As before, I would recommend you stick with the same type of database you use in production.


In closing

There certainly are a lot of ins and outs surrounding this topic. I hope I've helped make them clearer, and introduced some new ideas.

If you liked the things discussed in this book, please have a look at my Laravel package - Adapt - A Database Preparation Tool which combines many of the ideas discussed here.

Please send me your feedback or questions, and let me know if there's anything that should be added or tweaked.

I'd like to say a special thanks to Jason McCreary and Kai Sassnowski for their valuable feedback which helped shape this book.
