Have you ever memory-profiled in a unit test?

Dennis Frühauff on October 8th, 2024

Today's article will be a short but hopefully useful one: Have you ever had to memory-profile code during unit test execution?
Thought so. Let's take a look at how that works.


Introduction

I am sure we've all come across major or minor memory leaks in our applications. As .NET developers, we do not tend to think about memory management all the time.
Most of the time, we can just trust the garbage collector to do its work, and most of the time it does that pretty nicely.
But living in the ecosystem of a memory-managed framework does not mean that you cannot write memory-leaking code.
Quite the opposite: precisely because .NET developers rarely have to think about it, forgetting to do so can lead to pretty ugly bugs.


And if one of those pops up, the sharpest tool in our workshop is our favorite IDE's memory profiler, which usually lets us run the application with a profiler attached and take dedicated snapshots that we can compare against each other. If you know what to look for, chances are you'll find the leak very soon.


But have you ever memory-profiled a unit test? Do you think it is possible?


Until a few weeks ago, my answer would have been "Of course that's possible. Just right-click and run with a profiler".
Unfortunately, that is not the case.


Screenshot 1


But first of all: Why would you want to do that?


  • You suspect a memory leak in your code and want to prove it in an isolated manner.
  • You want to guard your application against unintentional memory leaks in advance.
  • The test code itself is leaking memory, potentially blowing up your CI pipeline as the number of tests increases.

Memory-profiling unit tests in JetBrains Rider

By default, JetBrains Rider offers to profile unit tests in a time-based manner. That means that you can analyze execution times in your code.
But you cannot just analyze the memory profile of a test.


What you need to do is add a dedicated (and already pretty old) NuGet package, JetBrains.DotMemoryUnit, to your test project.
After doing this, you will see a new test run option in your IDE.


Screenshot 2
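

For reference, adding the package from the command line could look like this (using the package ID mentioned above):


dotnet add package JetBrains.DotMemoryUnit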


Once that is complete, you can start writing memory assertions directly in your tests:


[Fact]
public void Foo()
{
    dotMemory.Check(memory =>
    {
        memory
            .GetObjects(q => q.Namespace.Like("Tests"))
            .ObjectsCount
            .Should().Be(0);
    });
}

If you are running your tests with xUnit, you will face a runtime exception:


DotMemoryUnitException
xUnit does not capture the standard output stream which is used by dotMemory Unit to report issues and save workspaces.

Fortunately, the package provides you with an API to work around this, so that a full example test class would look like this:


using FluentAssertions;
using JetBrains.dotMemoryUnit;
using Xunit;
using Xunit.Abstractions;

namespace MyTests;

public class FirstTestClass
{
    public FirstTestClass(ITestOutputHelper output)
    {
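        // Route dotMemory Unit's output through xUnit's ITestOutputHelper;
        // without this, the DotMemoryUnitException shown above is thrown.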
        DotMemoryUnitTestOutput.SetOutputMethod(output.WriteLine);
    }
    
    [Fact]
    public void Foo()
    {
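        // Take a snapshot and assert that no objects from the test
        // namespace are still in memory.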
        dotMemory.Check(memory =>
        {
            memory
                .GetObjects(q => q.Namespace.Like("MyTests"))
                .ObjectsCount
                .Should().Be(0);
        });
    }
}

This package, or rather this extension to your test runner, lets you:


  • Assert and filter for objects within snapshots,
  • Assert comparisons between different memory snapshots within your test (see the sketch after this list),
  • Analyze memory traffic allocated in your code,
  • Perform continuous analysis by writing automated dumps for failed tests.
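
To give an impression of the first three bullets, here is a minimal sketch of a snapshot comparison combined with a traffic assertion. It is meant as an additional test inside the class shown above; MyService, RunWorkload, and the threshold are made-up placeholders for illustration, not part of the dotMemory Unit API.


    [Fact]
    [DotMemoryUnit(CollectAllocations = true)] // traffic data is only collected with this attribute
    public void Does_not_leak_service_instances()
    {
        // Checkpoint before the code under test runs.
        var checkpoint = dotMemory.Check();

        RunWorkload();

        dotMemory.Check(memory =>
        {
            // Snapshot difference: no MyService instance created after the
            // checkpoint should have survived garbage collection.
            memory
                .GetDifference(checkpoint)
                .GetNewObjects(q => q.Type.Is<MyService>())
                .ObjectsCount
                .Should().Be(0);

            // Memory traffic since the checkpoint: how many MyService
            // instances were allocated along the way (threshold is arbitrary).
            memory
                .GetTrafficFrom(checkpoint)
                .Where(q => q.Type.Is<MyService>())
                .AllocatedMemory
                .ObjectsCount
                .Should().BeLessThan(1000);
        });
    }

    private static void RunWorkload()
    {
        // Hypothetical code under test that should not keep MyService alive.
        new MyService().DoWork();
    }

    private sealed class MyService
    {
        public void DoWork() { }
    }

The first Check call is only used to obtain a checkpoint; the assertions run against the snapshot taken by the second one.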

Conclusion

I told you this was a short one. Maybe a few readers of this post can put it to use, like I recently had to.
And maybe I could save you some time by pointing you in the right direction.


P.S.: I am not aware of Visual Studio's capabilities when it comes to this topic. From what I know at this moment, I would assume that there is also no out-of-the-box way to do it, but I did not explicitly check this.


