On January 4, 2017, I described a programming project that I would work on this year. That project is Time Tortoise, a Windows 10 app for time tracking. In this final post of 2017, I’ll review what I learned this year from the Time Tortoise project.
Technologies, Patterns, and Tools
To learn a technology, it’s not enough just to read about it or watch a video. You need to build something to appreciate the details and resolve the implementation challenges that aren’t covered in summaries and tutorials. Here are the technologies used in Time Tortoise:
Universal Windows Platform (UWP)
As of early 2017, UWP was the newest technology for Windows 10 app development, so I decided to try it out for Time Tortoise. However, I designed things so that most of the Time Tortoise code doesn’t depend on UWP.
The diagram at the top of the linked article has a Universal Windows Platform box surrounding the View, ViewModel, and Model components. Based on what I know now, I think it would be more accurate to only associate the View with UWP. The remaining components consist of Class Library projects that can be called from other clients, like the console app that I wrote for testing purposes, and the xUnit.net automated unit tests. They aren’t specific to UWP, though they are compatible with it.
So for Time Tortoise, UWP is used for the user interface, and the app itself is a Windows 10 UWP app. But most of the code doesn’t know anything about UWP. I found this to be a useful design choice over the course of the year.
Minimizing the amount of code in the View is a desired outcome of the MVVM design pattern. MVVM was created for Windows Presentation Foundation (WPF), an older UI technology with similarities to UWP.
Separating the Model and ViewModel from the View improves testability and makes an app less dependent on a particular UI technology. MVVM also simplifies data binding, a technology that keeps the UI up to date without the app developer needing to get involved in the implementation details.
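As a sketch of the pattern, a ViewModel raises a change notification whenever a bound property changes, and the data binding machinery refreshes the View in response. (The class and property names here are illustrative, not taken from the actual Time Tortoise source.)

```csharp
using System.ComponentModel;
using System.Runtime.CompilerServices;

// A minimal ViewModel base class: raising PropertyChanged is what lets
// XAML data binding update the UI without view-specific code.
public class ViewModelBase : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged([CallerMemberName] string name = null) =>
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(name));
}

public class ActivityViewModel : ViewModelBase
{
    private string _name;

    public string Name
    {
        get => _name;
        set
        {
            if (_name != value)
            {
                _name = value;
                OnPropertyChanged(); // any bound control updates automatically
            }
        }
    }
}
```

Because the ViewModel depends only on `System.ComponentModel`, it can be exercised directly by unit tests, with no UI involved.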
Extensible Application Markup Language (XAML)
XAML is the language used to describe the user interface of a UWP or WPF application. It’s analogous to HTML for web pages.
Besides the basic XAML required for the Time Tortoise UI, I also spent a week on XAML input validation, which provides feedback to the user if they enter an unexpected value in a textbox.
Entity Framework Core (EF Core)
EF Core, the open-source version of Entity Framework, is an object-relational mapping (ORM) framework. Although database queries are a small part of the Time Tortoise codebase, I had an opportunity to get into the fundamentals of EF, including how to load an entity along with its related entities.
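Eager loading of related entities can be sketched as follows. The entity names mirror the Activity and Time Segment model described later in this post, but the context setup and connection string are simplified assumptions, not the actual Time Tortoise code.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class Activity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<TimeSegment> TimeSegments { get; set; }
}

public class TimeSegment
{
    public int Id { get; set; }
    public DateTime StartTime { get; set; }
    public DateTime EndTime { get; set; }
}

public class TimeTrackerContext : DbContext
{
    public DbSet<Activity> Activities { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options) =>
        options.UseSqlite("Data Source=timetortoise.db"); // illustrative path
}

public static class ActivityQueries
{
    // Without Include, EF Core leaves TimeSegments unloaded:
    // related entities must be requested explicitly.
    public static Activity LoadWithSegments(TimeTrackerContext db, int id) =>
        db.Activities
          .Include(a => a.TimeSegments)
          .Single(a => a.Id == id);
}
```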
Since Time Tortoise is a single-user app, SQLite, a popular embedded database, is a good choice for data storage. Most of the SQLite details are hidden by EF Core, but I found it useful to occasionally browse the database directly. I also resolved some issues related to the fact that SQLite is written in C (known as “unmanaged code” in the .NET world), and doesn’t always work perfectly in .NET applications.
C# and Visual Studio 2017 (VS2017)
When I started the Time Tortoise project, Visual Studio 2015 was the current released version. A few months in, I upgraded to VS2017. I wasn’t looking for any particular features in the upgrade. But especially when working with a new technology like UWP, it’s best to stick with the latest tools.
SignalR is a library that allows .NET applications to use network sockets without having to handle low-level details. I use it to communicate between the main Time Tortoise app and Time Tortoise Companion, a WPF app that handles features that aren’t allowed in UWP (more on that in the Program Features section below).
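A receiving client can be sketched with the SignalR 2 .NET client API, roughly as follows. The URL, hub name, method name, and message shape are all illustrative assumptions, not the actual Time Tortoise wiring.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR.Client;

public class CompanionListener
{
    public async Task ListenAsync()
    {
        // Connect to a self-hosted SignalR endpoint (illustrative URL).
        var connection = new HubConnection("http://localhost:8080/");
        IHubProxy proxy = connection.CreateHubProxy("MessageHub");

        // Runs when the server-side hub invokes "Send" on its clients.
        proxy.On<string>("Send", message =>
            Console.WriteLine($"Received: {message}"));

        await connection.Start();
    }
}
```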
MouseKeyHook is a library that exposes global mouse and keyboard events to .NET apps. Time Tortoise Companion uses it to detect whether the user is currently active or idle, which is an important distinction for a time tracking app.
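Idle detection on top of MouseKeyHook’s global events can be sketched like this; the class is my own illustration of the idea, not the actual Time Tortoise Companion code.

```csharp
using System;
using Gma.System.MouseKeyHook;

// Record the last time the user moved the mouse or pressed a key.
// Idle time is then simply (now - LastInputTime).
public class IdleDetector : IDisposable
{
    private readonly IKeyboardMouseEvents _events;

    public DateTime LastInputTime { get; private set; } = DateTime.Now;

    public IdleDetector()
    {
        _events = Hook.GlobalEvents();
        _events.MouseMove += (s, e) => LastInputTime = DateTime.Now;
        _events.KeyDown += (s, e) => LastInputTime = DateTime.Now;
    }

    public TimeSpan IdleTime => DateTime.Now - LastInputTime;

    public void Dispose() => _events.Dispose();
}
```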
Time Tortoise uses text files for some features, as a supplement to the UWP user interface. I selected YAML as the most friendly input format, and YamlDotNet to deserialize YAML files into C# objects.
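Deserializing a YAML settings file into an object looks roughly like this with YamlDotNet; the `Settings` class and its property names are hypothetical, chosen only to illustrate the mapping.

```csharp
using System.IO;
using YamlDotNet.Serialization;
using YamlDotNet.Serialization.NamingConventions;

// A hypothetical settings class; the properties are illustrative.
public class Settings
{
    public int IdleTimeoutSeconds { get; set; }
    public string ReportDirectory { get; set; }
}

public static class SettingsLoader
{
    public static Settings Load(string yamlText)
    {
        // Map camelCase YAML keys to PascalCase C# properties.
        var deserializer = new DeserializerBuilder()
            .WithNamingConvention(new CamelCaseNamingConvention())
            .Build();

        return deserializer.Deserialize<Settings>(new StringReader(yamlText));
    }
}
```

A file containing `idleTimeoutSeconds: 300` would populate `IdleTimeoutSeconds` with 300.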
.NET Core and .NET Standard
.NET Core is an open source, cross-platform version of the .NET Framework. It implements a subset of the classic .NET Framework, and this subset is compatible with UWP, so .NET Core class libraries can be consumed by UWP projects.
.NET Standard is a specification, not an implementation. However, many .NET implementations, including .NET Core, target the .NET Standard. Visual Studio offers both .NET Core and .NET Standard class libraries. The latter is a subset of the former. Since I haven’t run into anything I need in .NET Core that isn’t in .NET Standard, I chose to use .NET Standard class libraries.
For most of the year, Time Tortoise used Portable Class Libraries, since they worked best with both UWP and xUnit.net. However, I eventually upgraded to .NET Standard 1.4, and then to .NET Standard 2.0, the most current release as of the end of this year.
One of my goals when developing Time Tortoise was to write unit tests as I wrote the app code. That ended up taking a lot of development time, but it also provided design and quality benefits, along with more opportunities for learning. In this section, I’ll go over the technologies, patterns, and tools specific to unit testing.
xUnit.net is a third-party test framework that is actively developed, has good support in Visual Studio, and works with .NET Core/.NET Standard.
Unit tests vs. integration tests
The term unit test often refers to all tests written by developers. I use it that way for simplicity. But it’s more precise to distinguish between unit tests, which test an isolated part of the system, and integration tests, which test an end-to-end scenario. While both kinds of tests are useful, it’s important not to write an integration test when a unit test would suffice, because unit tests run faster and are less likely to break.
For Time Tortoise, I have both unit tests and integration tests. A simple example of an integration test is one that accesses the database. These kinds of integration tests are important to ensure that database access code is tested. Fortunately, SQLite offers an in-memory database, which makes integration tests much faster. And when dealing with test data, it’s actually a benefit that the database disappears once the test completes.
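Setting up an in-memory SQLite database for an integration test can be sketched as follows. The entity and context here are deliberately minimal stand-ins; the key point is that the database exists only while the connection stays open.

```csharp
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;

public class Widget { public int Id { get; set; } } // placeholder entity

public class TestContext : DbContext
{
    public TestContext(DbContextOptions<TestContext> options) : base(options) { }
    public DbSet<Widget> Widgets { get; set; }
}

public static class InMemoryDbTest
{
    public static void Run()
    {
        using (var connection = new SqliteConnection("DataSource=:memory:"))
        {
            connection.Open(); // keep open for the lifetime of the test

            var options = new DbContextOptionsBuilder<TestContext>()
                .UseSqlite(connection)
                .Options;

            using (var db = new TestContext(options))
            {
                db.Database.EnsureCreated(); // build the schema in memory
                db.Widgets.Add(new Widget());
                db.SaveChanges();
                // ... assertions against db go here ...
            }
        } // connection closes: the database is gone
    }
}
```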
To avoid hitting the database or another resource in a unit test, it’s necessary to inform the system that you’re running a unit test, not an integration test or the actual app. The cleanest way to do this is called dependency injection (DI). DI works by having each class constructor accept a list of its dependencies. For example, a class that needs to access the database could accept a data access layer repository class as one of its dependencies.
The reason to have classes accept dependencies is to allow unit tests to pass in mock dependencies, like a database repository that returns test data rather than data from a real database. This allows the unit test to focus on testing an isolated part of the system, without having to set up test data in a real database.
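A minimal sketch of constructor injection, with names invented for illustration: the class under test accepts its repository through an interface, and the test hands it a fake.

```csharp
using System.Collections.Generic;

// The dependency is expressed as an interface so tests can substitute
// their own implementation; the member shown is illustrative.
public interface IRepository
{
    List<string> GetActivityNames();
}

// The ViewModel receives its dependency instead of creating it.
public class ActivityListViewModel
{
    private readonly IRepository _repository;

    public ActivityListViewModel(IRepository repository) =>
        _repository = repository;

    public int ActivityCount => _repository.GetActivityNames().Count;
}

// A hand-rolled fake for a unit test: no database required.
public class FakeRepository : IRepository
{
    public List<string> GetActivityNames() =>
        new List<string> { "Writing", "Coding" };
}
```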
To create mock dependencies, I’m using a library called Moq, which accepts an interface and lets me specify how each member of that interface should behave. For example, I can specify the data that a mock database repository method should return.
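With Moq, specifying the data a mock repository method should return looks roughly like this (the interface is a sketch, not the actual Time Tortoise repository):

```csharp
using System.Collections.Generic;
using Moq;

public interface IActivityRepository
{
    List<string> GetActivityNames();
}

public static class MockExample
{
    public static IActivityRepository CreateMockRepository()
    {
        var mock = new Mock<IActivityRepository>();

        // Specify the data this member should return when called.
        mock.Setup(r => r.GetActivityNames())
            .Returns(new List<string> { "Writing", "Coding" });

        return mock.Object;
    }
}
```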
A mock object doesn’t have any behavior by default. Since many tests don’t care about the specific behavior of a mock (when it’s unrelated to the unit that they’re testing), I found it helpful to write a helper class that sets up mocks consistently. That avoids duplicate code, and makes it easier to remember how mocks will behave.
In general, any dependency that the test doesn’t control needs to be mockable. Other examples include the system clock (because it’s non-deterministic) and the SignalR client (because it relies on the network and a server app).
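The system clock case can be sketched by hiding DateTime.Now behind an interface (the names here are my own, not the actual Time Tortoise abstractions):

```csharp
using System;

// Wrapping the clock behind an interface makes time-dependent code
// deterministic in tests.
public interface IDateTimeProvider
{
    DateTime Now { get; }
}

// The app passes in the real clock.
public class SystemDateTimeProvider : IDateTimeProvider
{
    public DateTime Now => DateTime.Now;
}

// A test passes in a fixed clock: "now" is whatever the test says.
public class FixedDateTimeProvider : IDateTimeProvider
{
    private readonly DateTime _now;
    public FixedDateTimeProvider(DateTime now) => _now = now;
    public DateTime Now => _now;
}
```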
Visual Studio offers a code coverage feature that keeps track of which lines of app code are executed by tests. This is a convenient way to identify code that still needs to be tested.
The code coverage system reports coverage results as a percentage. Teams that use code coverage typically set a minimum level that developers must adhere to. For example, “all code must be covered to at least 75%”. For Time Tortoise, I decided to target 100% coverage. This is more of a binary approach: complete coverage vs. less than complete coverage, rather than a target percentage. If the system is at less than complete coverage, it’s time to write more tests.
Testing tends to have diminishing returns, so I agree with the consensus opinion that it’s not cost-effective to target 100% coverage. However, it’s interesting to see what it takes to keep a project at that level. And it does avoid a common problem with testing: developers avoid writing tests for parts of the system that are difficult to test, and so design problems that are causing test difficulties don’t get fixed.
I am able to keep Time Tortoise at 100% coverage by using dependency injection, excluding generated code (that I don’t write myself), and spending a lot of time designing and writing tests.
The code coverage system was fairly reliable once I got it set up, but I did run into one problem that caused all .NET Standard projects to stop reporting coverage numbers. Fortunately, I found a simple fix.
The dependencies used in dependency injection need to be interfaces, not concrete classes. This is so unit tests can pass in mock objects, while app code passes in real objects that inherit from the same interfaces.
Unfortunately, many .NET Framework classes don’t inherit from interfaces, or they inherit from a combination of interfaces and concrete classes (which means not all members can be mocked). That’s where SystemWrapper comes in. It’s a library that provides interfaces and corresponding wrapper classes for .NET Framework classes.
The problem with using SystemWrapper for Time Tortoise is that SystemWrapper is based on the classic .NET Framework, not .NET Core or .NET Standard. I created my own version, SystemWrapperCore, with the few interfaces and wrappers that I needed for Time Tortoise. I’ll migrate additional code from SystemWrapper as required.
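The pattern is simple to sketch: for a .NET Framework class like System.IO.File, supply an interface plus a thin wrapper that delegates to the real static methods. The member set below is a minimal illustration, not the library’s actual surface.

```csharp
using System.IO;

// Interface that mockable code can depend on.
public interface IFileWrap
{
    bool Exists(string path);
    string ReadAllText(string path);
}

// Thin wrapper that the app passes in at runtime.
public class FileWrap : IFileWrap
{
    public bool Exists(string path) => File.Exists(path);
    public string ReadAllText(string path) => File.ReadAllText(path);
}
```

App code depends on `IFileWrap`, so unit tests can mock file access the same way they mock the database.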
One aspect of unit testing took up much more time than I expected this year. The problem has to do with NuGet dependencies. NuGet is a convenient tool for managing external .NET dependencies. For example, Time Tortoise projects reference EF Core, SignalR, and even UWP itself, using the NuGet Package Manager. NuGet takes care of finding the appropriate assemblies (DLLs) for these dependencies, and copying them to the appropriate output directories.
For running the app, this process works fine. DLLs get copied properly, and the app runs. But for unit testing, I ran into a problem: The build process failed to copy dependencies to the unit test output directory, which means that unit tests aren’t discovered (don’t appear in the Visual Studio Test Explorer) or throw exceptions when run. Multiple Stack Overflow questions reported similar problems.
Over the course of the year, I came up with a partially automated process for getting DLLs copied to the right locations, but it ate up a lot of time from the Time Tortoise effort. Here’s how things happened, in chronological order:
- I first encountered the problem after I upgraded to Visual Studio 2017. I fixed it by manually copying files.
- I ran into the problem again when writing tests for the idle time detection feature.
- At this point, I wrote a two-part comprehensive explanation of the problem, to get a handle on the details.
- The problem popped up again after the upgrade to .NET Standard, because that upgrade affected the class library types used for unit test projects.
- One week, I tried an experiment based on information from an xUnit.net documentation page: apparently xUnit test projects are supposed to use .NET Core, not .NET Standard. This seemed to fix the dependency copying problem, but it broke my code coverage, so I kept things as-is with .NET Standard used for test projects.
- A tool called the Assembly Binding Log Viewer, which comes with Visual Studio, is essential for identifying missing dependencies. But its output files are inconvenient to analyze. To simplify things, I wrote an Assembly Binding Log Parser to partially automate the process of dependency resolution.
- Upgrading to .NET Standard 2.0 caused the problem to reappear. By this point I had tools and processes to resolve the known issues. However, I did find a new issue related to SQLitePCL.raw, a library that was adopted in .NET Standard 2.0. Since SQLite is not a .NET assembly, my existing tools didn’t help. But I was able to resolve the problem by using Process Manager to identify a missing DLL.
The dependency resolution saga is an example of a hidden cost of unit testing. It’s common knowledge that unit testing means writing test code in addition to app code, which takes extra time. But unit tests also require unit test infrastructure, which can fail independently of the app infrastructure. And since not every developer uses the unit test infrastructure (while they all use the app infrastructure; otherwise there’s no product), it can be harder to resolve unit test problems, since they may be seen as less of a priority by tool vendors and the community.
Last week, I wrote a summary of current Time Tortoise functionality and ideas for future enhancements. Here’s the order that things happened over the course of the year:
Startup and Basics
- The initial commit, with a basic solution design.
- Add, save, and list activity names, which identify work to be timed. Changes to the View, ViewModel, Model, Data Access Layer (DAL), and Repository.
- Initial work for time segments, which record start and end times. Changes to the Model, ViewModel, Database, DAL, and tests.
- The time segment user interface. Using EF Core to save a parent object and its children (an Activity and its Time Segments).
- Add and delete time segments.
- Using Windows.UI.Xaml.DispatcherTimer to track elapsed time.
- Fit and finish and bug fixes.
- Packaging and installing UWP apps for self-hosting. (Part 2).
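The DispatcherTimer approach from the list above can be sketched as follows; the one-second interval and the class shape are illustrative, not the actual Time Tortoise implementation.

```csharp
using System;
using Windows.UI.Xaml;

// DispatcherTimer raises Tick on the UI thread, so the elapsed-time
// display can be updated directly from the handler.
public class ElapsedTimeTracker
{
    private readonly DispatcherTimer _timer = new DispatcherTimer();
    private DateTime _startTime;

    public TimeSpan Elapsed { get; private set; }

    public void Start()
    {
        _startTime = DateTime.Now;
        _timer.Interval = TimeSpan.FromSeconds(1);
        _timer.Tick += (sender, e) => Elapsed = DateTime.Now - _startTime;
        _timer.Start();
    }

    public void Stop() => _timer.Stop();
}
```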
Time Tortoise Companion and Idle Time Detection
- A discussion of UWP limitations that affect idle time detection, a critical program feature.
- Designing a companion app to handle functionality that is not available to UWP.
- Using Hardcodet.NotifyIcon.Wpf to create a notification icon, the visual representation of the companion app running in the background.
- Using SignalR to send messages from Time Tortoise Companion to Time Tortoise.
- The idle time user interface, which indicates how long the user has been idle, and allows them to include or exclude their idle time.
Text File User Interface
- Designing a text file user interface to supplement the UWP graphical user interface.
- Reading and writing text files for the text file UI.
- Implementing a settings infrastructure using the text file UI.
- Implementing a daily summary report using the text file UI. (Part 2).
- A summary of the self-hosting process, which allows two versions of Time Tortoise to run in parallel, an independent version for self-hosting and a version running in Visual Studio for ongoing development.
- A short task backlog listing features required for effective self-hosting.
Some final thoughts from the Time Tortoise experience:
- Source control doesn’t capture everything that’s important about a software project. Especially in the area of unit tests, the Time Tortoise environment would often break, as described in the Resolving Dependencies section above. While it would be nice to have a stable and predictable environment, things don’t always work out that way. As an alternative, it’s good to have tools to get your environment back to a working state when something breaks. In some cases, it may even make sense to maintain a clean dev VM in a known good state, and just revert to it when necessary.
- Test-first development may not be practical when learning a new technology (which developers frequently do). Code coverage measurement can help ensure that all source code is nevertheless covered by tests.
- It’s easy for one class to gradually take on too much responsibility. The Time Tortoise MainViewModel class may be an example of this.
- Upgrading external components during development can take time and break things, but it’s still worth staying up to date with current technology, especially for a learning project.
- It’s useful to have a real UWP/.NET Standard app to refer to. Tutorials are helpful, but it’s different to have a real app, with many environment-specific issues worked out.
Tracking the Time Tracker
During 2017, I spent:
- About 361 hours on Time Tortoise development, or about 59 minutes per day.
- About 233 hours on blog posts related to Time Tortoise, or about 38 minutes per day.
This is the end of Year 3 of the blog, and the first year I spent working on and writing about a single programming project. For a summary of posts from 2015, see Red-Green-Code: 2015 in Review and Summer Review. For 2016, see Red-Green-Code: 2016 in Review.
I have a new project planned for next year. For regular updates, subscribe to my email newsletter.
See you all next year!
(Image credit: A desert scene from southern New Mexico, where I spent some time this week)