Are Mocks/In-Memory Tests a Good Idea?

Integration testing examines whether the parts of an application can communicate and function together, and if so, how well. There are many approaches to a successful integration test, but the commonly used "mocks/in-memory" method may not deliver the same quality of results at the integration level that it does at the unit testing level.

Developers may turn to in-memory testing to save time and run more tests, but this approach strips out much of what matters in an integration test. Integration testing approaches that use the application's actual database infrastructure, or a faithful recreation of it, provide far more reliable, measurable, real-world results.

Integration Testing with Mocks/In-Memory

Businesses often go with mocks/in-memory testing in the development process because it's faster. Holding mocked database contents in memory (RAM) for a test essentially eliminates the seek-time delays associated with a disk-based database. The method is extremely popular for unit testing because the tests are self-contained. And if your application already runs on an in-memory database in production, this method will provide accurate results.

However, an in-memory stand-in does far less work per query than a production database server and doesn't always support the same SQL functions, so you won't get accurate performance data. Small, repeated delays that add up to real performance problems in production simply never appear in an in-memory test.
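To illustrate the dialect gap, here is a minimal sketch using Python's built-in sqlite3 module as the in-memory stand-in; the table, data, and the PostgreSQL-style ILIKE query are illustrative, not taken from any particular application:

```python
import sqlite3

# An in-memory SQLite stand-in happily runs portable SQL, but rejects
# dialect-specific syntax that the production server would accept, so a
# passing test here proves little about production queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")
conn.execute("INSERT INTO items VALUES ('Widget')")

# Portable query: works on both engines.
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 1

# PostgreSQL-only syntax (case-insensitive ILIKE): fails on the stand-in.
try:
    conn.execute("SELECT name FROM items WHERE name ILIKE 'widget'")
except sqlite3.OperationalError:
    print("dialect mismatch")  # the stand-in cannot parse this query
```

The inverse risk exists too: a query that passes on the stand-in may still fail or behave differently on the real server.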

Swapping mocked data in place of information coming from an actual server is problematic not only because of its inaccurate performance results, but also because these tests are difficult to maintain. Additionally, using in-memory testing to initiate and load multiple tests at once can cause problems: while this saves time, tests can overlap each other and interfere with one another's results.
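One common mitigation for the overlap problem, sketched below with an in-memory SQLite database purely for illustration, is to give every test its own fresh database so that no state is shared between tests:

```python
import sqlite3

def fresh_db():
    # A brand-new database per test: nothing carries over between tests.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    return conn

def test_insert():
    conn = fresh_db()
    conn.execute("INSERT INTO users VALUES ('alice')")
    return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

def test_starts_empty():
    # Runs against its own database, so test_insert cannot interfere.
    conn = fresh_db()
    return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

print(test_insert())        # 1
print(test_starts_empty())  # 0
```

Per-test setup like this restores isolation, but it also adds maintenance work and gives back some of the speed that motivated the in-memory approach in the first place.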

The core problem with in-memory testing is that it doesn't accurately represent the production environment, which makes it difficult to recommend for the integration testing step. It does not make a lot of sense to work around these additional issues when the test itself already fails to provide accurate results.

Using an Actual Database

The most accurate results come from running the test against the actual database infrastructure. If the actual database is unavailable or not yet implemented, the development team can use a similar database server to run the test. This method gives up the speed advantages of in-memory tests, but it produces tangible, real-world results. By contrast, running the test against an in-memory substitute rather than a disk-based database means the results won't provide the same level of insight into the application architecture.
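A minimal sketch of how a suite can point at whichever database is available, assuming an illustrative TEST_DB environment variable and using SQLite's on-disk mode as a stand-in for a real server (a production suite would use the real engine's connection string):

```python
import os
import sqlite3
import tempfile

def connect_test_db():
    # Prefer the configured database (real or similar server); fall back to
    # a throwaway on-disk file so queries still exercise a disk-backed
    # engine rather than pure RAM.
    target = os.environ.get("TEST_DB")
    if target is None:
        target = os.path.join(tempfile.gettempdir(), "integration_test.db")
    return sqlite3.connect(target)

conn = connect_test_db()
conn.execute("CREATE TABLE IF NOT EXISTS ping (ok INTEGER)")
conn.execute("INSERT INTO ping VALUES (1)")
print(conn.execute("SELECT COUNT(*) FROM ping").fetchone()[0] >= 1)  # True
```

Keeping the target in configuration means the same suite runs unchanged against the real database once it becomes available.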

However, tests will take much longer to perform when running on a disk-based platform compared to an in-memory platform, meaning the development team will not be able to conduct as many tests within the same time period. This can be a drawback for regular tests, but the importance of the integration testing step in the development process means it’s a wise investment to get it right. Development teams can improve test times by implementing more efficient database caching techniques for the test.
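As one hedged sketch of such caching, repeated read-only lookups during a test run can be memoized so each one hits the database only once; the function name and data below are illustrative stubs, not a real schema:

```python
import functools

@functools.lru_cache(maxsize=None)
def lookup_plan(product_code):
    # In a real suite this body would query the disk-based test database;
    # a stub dictionary stands in so the sketch is self-contained.
    return {"A1": "basic", "B2": "premium"}.get(product_code)

print(lookup_plan("A1"))  # first call does the (simulated) database work
print(lookup_plan("A1"))  # repeat call is served from the cache
```

Caching only helps for data that doesn't change during the run; anything a test mutates must still be read fresh, or the cache will hide the very interactions the integration test exists to measure.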

The main purpose of integration testing is to measure the interface between different parts of the application. Many of these parts may require making calls to the database, which takes time and is subject to communication problems, so replacing the actual database with an in-memory solution tests a platform that performs differently.

If your business is looking to improve its application development practices, don’t hesitate to contact the integration testing experts at Apica.