27
45 Comments

Don't Write Tests For Your MVP

Not sure if this is an unpopular opinion or not, but I wanted to share it based on my experience. First, a little background.

I learned to code by building my first app, Graphite Docs. When I was learning to code, I didn't follow any best practices, didn't test a damn thing. I mean, I tested things manually, of course. But I didn't write any unit or integration tests. As Graphite progressed, and as I had to work on other things, I became a better developer. When working on projects that were not MVPs and not early-stage startup products, I wrote tests. I learned the value of testing.

But as I've been working on my new side project, Perligo, I found myself writing unit tests and becoming frustrated by how much it was slowing me down. The product hasn't shipped yet, and every minute I spend writing tests is a minute I'm not spending finishing features.

It took me longer to realize this than I would have liked, but unit tests and integration tests, while good practice, are altogether useless for a pre-release product. Yes, it will be a little more difficult to add tests later, but not so difficult that I think you (or I) must write them while building an MVP.

Honestly curious if this is a shared opinion or an unpopular one.

  1. 13

    I'd say it highly depends on the product and your professional experience with test-driven development whether writing unit tests for an MVP makes sense.

    If it's a product with few interaction items and simple business logic then you're probably gonna be faster w/o writing tests. If there's complex business logic then writing unit tests is almost always faster than manual testing if you have a bit of experience.

    I'd say write unit tests where it makes sense. It's definitely not important to have amazing test coverage in your MVP. Find the code blocks that are important and have many different input values and unit-test those. Do the rest by hand.

    1. 5

      Agreed. Especially for side projects where you might not be in the code every day. Having tests cover the more nuanced parts of the code is important to ensure you don't accidentally break something.

      Personally, for a lot of unit testing, I find it faster to write the unit tests at the same time. If I am going to test it in the interactive prompt anyways, I might as well formalize it and put it in a test.

    2. 2

      Yep, I agree. No need to test everything. Not even the "important" parts. Unless the important parts are really complex and hard to test manually. Then you're actually saving time by writing those tests, even during the MVP phase.

    3. 2

      I like nuanced responses like this. Genuinely appreciate your answer. Thanks!

  2. 7

    TL;DR - I've spent more time wishing I had test coverage than time spent wishing I didn't have to write tests.

    I'm a fan of writing integration tests for the critical paths like signing up, login, billing, authorization, etc. Anything else is a nice-to-have unless it's complex business logic, where good test coverage is a gift for future you and future developers.

    For an MVP, I'd still write the integration tests just for my own peace of mind.

    I've been on the other end of non-existent test coverage, and it made it incredibly hard to know if I'd broken anything because I was new to the codebase. Imagine the time needed to manually test an app you're not familiar with every time you want to deploy.

    I also have a short memory so it may even be my own code I come back to months later that I'm not sure if I'm going to break.

    And then when I find that I've broken something in production, it's a panic to deploy a fix, and I may not get it right the first time because I have no test suite to run to confirm the fix actually works.

    So I definitely see the benefits of a good, reliable, fast to execute test suite. And the more you write tests, the easier and quicker it gets ;)

    1. 1

      This is a very solid take. I agree when working with a team, testing is imperative. I find it to be less necessary as a solo dev working on an MVP. But I 100% get the issue of forgetting your own code if it's been a while since you touched it.

  3. 3

    I find the controversy here interesting; for me it's a no-brainer (with the obvious exceptions).

    Where are we now? Don't write tests.
    Where are we going? Don't write features, don't write anything, just tweet.

    There is so much more you can do before an MVP.
    I would say: document some basic pre-launch tests and run them manually (10 min).

    1. 2

      I think regression testing (manually) is really important regardless of the stage. I tend to run manual regression tests using a simple spreadsheet approach. Very much like I think you're suggesting for the pre-launch testing.

      1. 2

        Yeah, I get what you mean about the spreadsheets.
        I keep mine versioned in an .md doc, which works well for simple tests.

  4. 2

    I've written tests for years, and I've discovered that there is a way to do it. For an MVP, I would follow this:

    1. Write tests for the critical moving parts. The parts of your application which MUST not break.
    2. If you don't write tests, make sure that you can test it easily afterward (be careful with coupling / cohesion).
    3. If you know you'll throw away your MVP to write your stuff in another language / framework, don't write any test.
    4. If you are sure you won't add many functionalities on top (you won't scale), don't test that much or not at all.
    5. When you write a test, make sure it tests some behavior of your application. It doesn't make sense to test a controller that only calls other stuff in your app. Test the stuff, not the controller.
    6. Never use mocks. If you find yourself reaching for one, ask whether you could decouple your dependencies so you don't need it. Mocking often means pulling in an external library, which means more problems and a possibility of high coupling between the different elements of your app.
    7. If your app mostly makes API calls, create quick fixtures and test against them. Don't call the APIs every time you test or you'll never run the suite (too slow).

    Having an app where you can easily write unit tests means that you can scale your app, most of the time. It means that it has been reasonably well designed, without too much coupling.

    The most important thing: don't make your tests dependent on each other. Make sure that if you change something minor in your app, you don't have to change all your tests as well.
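    To illustrate point 7, here's a minimal sketch (the fixture shape and `normalizeUser` are hypothetical, invented for this example): capture one real API response as a fixture, then test your app's handling of it so the suite never touches the network.

    ```javascript
    // Hypothetical sketch: test against a saved fixture instead of the live API.
    const assert = require('node:assert');

    // A response captured once from the real API and checked into the repo.
    const userFixture = {
      id: 42,
      attributes: { first_name: 'Ada', last_name: 'Lovelace', plan: 'pro' },
    };

    // The behavior worth testing: how the app reshapes the API payload.
    function normalizeUser(payload) {
      return {
        id: payload.id,
        fullName: `${payload.attributes.first_name} ${payload.attributes.last_name}`,
        isPaid: payload.attributes.plan !== 'free',
      };
    }

    // Runs instantly, every time, with no network involved.
    const user = normalizeUser(userFixture);
    assert.strictEqual(user.fullName, 'Ada Lovelace');
    assert.strictEqual(user.isPaid, true);
    ```

    The same fixture can later feed a full integration test; the point is that the slow, flaky part (the real API call) happens once, not on every run.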

    1. 1

      These are really great suggestions!

  5. 2

    I strive for an "MVT" approach, that is, Minimum Viable Tests for an MVP: tests for critical code paths, and tests for specific methods or functions that you know are maybe a little tricky.

    With an MVP we often rush through the code, so having some barebones tests as a sanity check helps me sleep better and has saved my ass plenty of times.

    I don't strive for any level of code coverage, nor do I worry about UI tests since the UI changes so often; I mostly test API endpoints and business logic, and for simple methods / classes I wouldn't bother at all in the early days. This is the approach I took with VuePilot initially, and I was able to produce a solid product in good time.

    It really only takes that one time when a test you wrote months ago picks up a bug in some new code you just wrote for you to appreciate that they're there.
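    A concrete sketch of the kind of "a little tricky" function worth an MVT (this `prorate` function is invented for illustration, not code from VuePilot):

    ```javascript
    // Hypothetical example: one tiny test for a function that's easy to get wrong.
    const assert = require('node:assert');

    // Charge only for the days remaining in the billing cycle.
    function prorate(monthlyPriceCents, daysLeft, daysInMonth) {
      if (daysLeft <= 0) return 0;
      return Math.round((monthlyPriceCents * daysLeft) / daysInMonth);
    }

    // Three sanity checks beat re-deriving this by hand before every deploy.
    assert.strictEqual(prorate(3000, 30, 30), 3000); // full month
    assert.strictEqual(prorate(3000, 15, 30), 1500); // half month
    assert.strictEqual(prorate(3000, 0, 30), 0);     // nothing left
    ```

    That's the whole idea of Minimum Viable Tests: no coverage targets, just a handful of assertions around the logic most likely to bite you.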

    1. 1

      MVT is a great approach. As many have said, it’s all a balance. I like monitoring systems to help me catch issues and then debug them but obviously having tests that can catch them before the monitoring system would be ideal.

  6. 2

    I hear you, I've wasted countless days testing my app heavily when it was still early. But I think there's a balance to be found.

    The way I fixed it is by mostly focusing on integration tests using Cypress, just to make sure the main functions of the app basically work, and then some unit tests for specific features that would be very hard to build without them (like the code for the logo generator).

    But as soon as something takes too much time "just to test it", like auth or checkout, I don't do it anymore. It wastes too much time.

    1. 1

      I've never used Cypress, but it looks awesome! I used Ghost Inspector at an old job, and this reminds me of that.

      1. 1

        Yes, it works pretty well. The only drawback is that it doesn't work with sequences where there's a domain change, so for example you can't test a social login. But for most things it's quick and works nicely.

  7. 2

    So I suck at writing tests, always have. That's why I opted not to write tests for Jamform before I released it and concentrated on development instead. A week or two after releasing the initial version of Jamform, things slowed down enough that I decided I'd at least write tests for the "important" parts of my code: API routes and Redux actions. In the first day, at only about 30% coverage, I had already found bugs that let free users use premium features, features that weren't usable by anyone, and some just very sloppily written code. It happened because every time I added a new feature, I didn't have time to manually go back and check that every single feature continued working under all use cases.
    That's why from here on out, on every project I work on, I'll at least write tests for critical code to prevent regressions, and because it forces me to think more thoroughly about all possible interactions with that code and cover edge cases.

    1. 1

      I feel this. I had this happen when I was first building Graphite. But even with issues like that, I don't regret not writing tests for Graphite right away. It was super important to get the product launched and in people's hands.

      That said, now, I am focusing on paying users right away, so I think a good middle ground is testing gatekeeping functionality even if it's an MVP.

  8. 2

    Hi Justin.

    I agree with your opinion.
    In my case, I work with a friend who does most of the programming, and what we do is test manually.
    Our app, Tweetline, was tested manually, and happily we have not seen bugs in more than 4 months.

    Obviously it's like that in our case because we don't have enough resources, but it works.
    Also, in more complex apps (we have only 2 screens) a unit test might be needed, but that's more $$ to spend and less time to write code and market the app.

    1. 2

      The complexity of the app definitely matters. As was said here in the comments, if there are complex functions even in a small app, I can see how testing would be super beneficial, even in an MVP with a solo dev.

  9. 2

    I wouldn't say they're useless, as tests document the system, helping you identify behaviours more easily; and if you're still trying to figure out which behaviours your users like and don't like, that can help.

    I do agree that they are a time sink, though. Ultimately, like all things in an MVP, it's about the feedback you want to get.

    You don't want to find out from your users that one of the core concepts in your app isn't working correctly, as you'll spend more time fixing that than you would have if you had tests in the first place.

    If you just want reassurance that nothing's regressed, then a high-level integration or system test can give you that safety net for the overall app, while any core business logic can be more finely tested.

    Of course if there's no time to re-engineer the product between MVP and production then there may be a mountain of technical debt you'll have to climb if you want to get things ready for production.

    Something I've found building apps in React Native, though, is that if you don't use testing, look to compensate for it with a comprehensive monitoring and error-reporting setup so you can track bugs down more easily.
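    As a rough sketch of what that safety net can look like in plain Node (the `reportError` function and in-memory queue are hypothetical; in practice you'd wire this to a service like Sentry or Bugsnag):

    ```javascript
    // Hypothetical sketch: a global error-reporting safety net.
    const reports = [];

    function reportError(err, context = {}) {
      // A real setup would POST this to an error-tracking service;
      // collecting reports in memory keeps the sketch self-contained.
      reports.push({ message: err.message, context, at: Date.now() });
    }

    // Catch anything that would otherwise crash or vanish silently.
    process.on('uncaughtException', (err) => {
      reportError(err, { source: 'uncaughtException' });
    });

    process.on('unhandledRejection', (reason) => {
      const err = reason instanceof Error ? reason : new Error(String(reason));
      reportError(err, { source: 'unhandledRejection' });
    });
    ```

    It's not a substitute for tests, but it means the first report of a broken flow comes from your dashboard, not an annoyed user.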

    1. 1

      I really like the idea of compensating early with comprehensive monitoring. It's a pretty good strategy to bridge the gap between MVP and the time when you need full testing (assuming you don't write a full test suite in your MVP).

  10. 2

    This is a pretty hot take but I agree.

    A robust testing infrastructure is for products that make money.

    You eventually want to get there, obviously, but don't put the cart before the horse. I think the way you eventually want to go is: once you're live with the MVP, for features and especially bugs, fix it and add some kind of minimal test to the project. You'll get there iteratively as your product grows. Yeah, there's technical debt, but that's likely not a major worry until it starts slowing you down.

    1. 1

      I actually think writing tests later is a good way to reorient yourself in the codebase. So even though it might take longer later, it could be a good thing beyond just having tests now.

      1. 2

        My guiding principle is always getting it out in public as fast as possible. And then fixing it haha. But you gotta get the thing live no matter what.

  11. 1

    My favorite idea around testing is from Kent C. Dodds:

    Testing is about increasing your confidence that the app is working as intended.

    That doesn't say anything about integration tests or unit tests... it tells you to ask yourself what tests you need to write to feel comfortable that your app works and keeps working as intended.

    Plus, if the tests are well written they are by far the best documentation. You understand what your intentions were. You could have tests that allow you to write "bad" code while prototyping. Replace it later—your tests can act as a "feature specification".

    The codebases I feel good about and that keep delivering for me have all had a comfortable amount of tests. Why is that?

    I think I have anxiety from being in projects with few tests, and where old code comes back to bite you in unexpected ways. And when I say old it could be last week's code that all of a sudden seems very unfamiliar.

    By writing tests I get to think through what it is I want to do. Simple mistakes are more easily avoided and I feel like I have a better understanding of all the moving parts. Oh, and next week I can read the tests if I can't figure out what I was thinking by reading the source code. Difference here between "what the code does" and "what was I thinking".

    I get why people don't like writing tests. It's hard to write good tests. "Good" implies some subjectivity. I mean tests that don't easily break and that are somewhat flexible in how the code is allowed to change. Too much rigidity and it feels like the tests are in my way.

    I'll end with a good testing quote from Guillermo Rauch:

    Write tests. Not too many. Mostly integration.

    1. 2

      That doesn't say anything about integration tests or unit tests... it tells you to ask yourself what tests you need to write to feel comfortable that your app works and keeps working as intended.

      I love this!

  12. 1

    I think you can get away with not writing unit tests, and I agree with some of your points. But if I happen to work on a mission-critical project where death can happen, it's important to write unit tests. Unit tests are also important for highly regulated products like finance, telecoms, etc. In the end, I guess whether or not to write tests still depends on the industry or the consequences of your project.

  13. 1

    I completely disagree with the many of you who don't think that writing tests for an MVP is very important. I am honestly disappointed to see that so many think that way.
    I have been writing proper tests for years, and now I would not want to build anything without them. Actually, I am seriously annoyed when there's something so complicated to test in an automated way that I am forced to skip it. Whatever project I start, if it's something more than an experiment / throwaway thing, I definitely write automated tests to cover as much as possible.

    Sure, writing tests when you start takes a little more time, but the more tests you write, the faster at it you become, to the point that it doesn't make much of a difference when you write them, and it pays huge dividends as you build the thing further.

    I love the peace of mind of adding/changing things without breaking other stuff. I only test manually those things that are very difficult to test in an automated way (a random example is direct uploads to S3 with presigned URLs).

    Even for an MVP, I want to start working on it the proper way, and that includes tests.

    1. 1

      As you say, "proper tests". The problem is that many people write tests which are not good. It's about building an MVP, not iterating on 10 projects to learn how to test correctly. I agree with you that tests are very important, but on the other hand, I'd prefer an untested app over 103903429 tests all coupled together. The result is the same.

    2. 1

      I can imagine if you've been doing it forever, you wouldn't want to change your process. And you shouldn't. Though I am genuinely curious whether you have built products with a ton of testing in place and launched to crickets. If so, did you regret writing those tests at all? I assume not, based on your response, but I'm definitely curious.

      1. 1

        No, I never regret writing tests, because I always want to be sure I produce a good product. I might prefer not building things at all if I don't see traction, but never untested stuff. :)

  14. 1

    Hi Justin, interesting opinion. As is so often the case, it depends!

    I realize Graphite Docs has been shut down, but for posterity's sake I would suggest fixing the link to the site in your post. Currently it's https://grahptiedocs.com/ as opposed to https://graphitedocs.com/

    1. 1

      Gah!!! Markdown is awesome but it’s also the worst. Fixing now! Thanks!

  15. 1

    I understand the frustration you're expressing -- you're not making enough useful progress, and you've identified an obstruction to that progress. It sure would be nice if that obstruction just weren't there! You'd make so much more progress! And I confess that I have had this thought in the past, when writing unit tests. But I still write tests, and I no longer have that thought. The secret?

    I don't write unit tests anymore.

    Unit tests are frequently written at a level far too low to really be useful, solidifying internal architectures and requiring copious levels of mocking and injecting to write. Unless you are writing code that is especially well-understood, unlikely to change, and incredibly important, unit testing is generally overkill.

    Instead, consider writing integration tests. For example, I write tests targeting my API endpoints, and I write these before I write the API endpoints. This can actually speed up your development! Consider that you have to exercise the code you've written in order to be sure that what you've written works. Without tests, you have to load up your changed app, manually navigate to the pertinent section of your app, fill in the forms or perform the action that will submit the request to your API, and then wait for the response. Worse, you have to have written the UI code that hits the API first, and if there's a bug in that code, you might not even notice a problem with your API code. If you've written integration tests for your API, you just... run the tests, and if they pass, you've written the code correctly. If you have to execute the code multiple times to test a function as you build it (which is quite likely if you're working on anything even vaguely complex, and tends to involve a lot more manual UI work as well), this can add a lot of extra development time.

    For Nodewood, I use Jest, which makes it very simple to gin up an instance of the app, send custom requests at it, and examine the response. I also use MassiveJS for data mapping. A test for a request can look as simple as:

    describe('POST /customers', () => {
      it('should validate the request', async () => {
        const response = await agent
          .post('/api/customers')
          .send({
            name: 'Customer Name',
            email: '[email protected]',
            phone_number: '1112223333',
          })
          .expect(200);

        expect(response.body).toMatchSnapshot();

        const customer = await db.customers.findOne({ email: '[email protected]' });
        expect(customer).toMatchSnapshot();
      });
    });

    This ensures that a new customer is saved correctly. It takes only a minute to write, and it's easy to extend with checks for new functionality that ought to happen when new customers are added (emails sent, 3rd-party accounts created, etc.).

    Best of all, none of this tests the internal processes that make it all happen. I could change my database wrapper, rewrite my entire controller, or change nearly anything about the internals of the application, but all this test cares about is that the same input gives the expected output in the response and in the DB. This gives you an excellent level of predictability and confidence in your code, makes it actually faster to write your APIs, and doesn't slow you down when you need to refactor how your code works internally.

    1. 2

      Honestly, mocking is one of the most frustrating things for me. I get it if you’re testing something super complex, but mocking results and data when testing your API seems odd. It’s almost like forcing the tests to be correct rather than testing the real thing.

      I like your approach a lot. Do you ever consider using Postman or something similar for running your API testing?

      1. 1

        Postman's great if you want to test only the API, but I also want to confirm that the correct info has been saved to the DB, that emails have been sent, that calls to S3 were made, etc. Using a more fully-featured testing library like Jest means I can set up those tests as well, to make sure the entire request is performed correctly, side effects and all.

      2. 1

        You can use newman to run a Postman collection if you use that tool and want to set up a CI check for your API.

        We used this on a project in my day job; it allowed us to get a daily check up and running and to maintain that suite more quickly than writing code for it.

        Similarly, Postman allows you to write tests for its responses, which means you can write entire user journeys via the API with little more than basic JS knowledge.

        I'm not sure I agree with your point about mocking, though. By removing the uncertainty of what the data is doing, you guarantee that if the tests fail it's the system, not the data, that caused it; and if you're not writing unit tests, you want to make sure you don't lose time tracking down bugs in your test data.

  16. 1

    It's a question of balance. I haven't found the right one for my project, but I erred on the "too much quality" side, to the detriment of functionality.

    I have lots of confidence in my codebase, but unfortunately it slows us down way too much. I've blogged about this a few days back: https://medium.com/swlh/20-months-in-2k-hours-spent-and-200k-lost-a-story-about-resilience-and-the-sunk-cost-fallacy-69fd4f61ef59

    I also posted the link on HN where it stayed on the front page for a while. It brought tons of interesting advice too: https://news.ycombinator.com/item?id=25627081#25632696

    I agree with @svenfrese, it depends on the complexity of the project / business logic.

    1. 1

      Hey Sebastien! I read your post when it came out and I really loved the honesty of it. Balance is definitely the key. That's largely the theme here in the comments, it seems.

  17. 1

    I like to think of it as a graph with speed on the y-axis and time on the x-axis. If you start without tests you will be high on speed (no pun intended ;)), and if you start with tests you will start low on speed.

    Normally, over time you will start losing speed without tests, while with tests you will gain speed along the way, as you will hopefully have to debug less and manually test less, especially for regressions.

    I think there are a lot of parameters in how long you can go without tests and/or how fast you will really be, depending on things like team size, experience, architecture, how much conviction you have that you're working on the right things (validation before coding), etc.

    For example, if my architecture is bad, I will most likely have big problems with regressions even early on, so I will start to slow down faster. If I don't have much conviction and I'm going to pivot 5 times along the way, writing tests will mostly be a waste of time, and so on.

    Ideally you will try to ride the wave of high speed without tests as long as necessary and make the jump to high speed with tests before the two curves intersect. So, like you said, if you stay solo on the team, push the tests out further, but if you hire, take some time to test the most critical things, for example.

    1. 1

      I like this take. Visualizing this as a graph sort of solidifies my belief that my approach is the right one for me. Thanks for this.

      1. 2

        Yeah I'm with you on that one. I usually approach things this way too.

        I think what you have to watch out for is being honest with yourself and trying to really objectively assess where you stand.

        It's very easy to fall into the trap of pushing out tests for the sake of being "faster" when some work on the foundation would actually make you faster in the long run, just because you're caught up in the moment and fail to see the long-term rewards. So it's good to schedule some inflection points up front. For example, you could say:

        30 days or 90 days into the project, or whenever you hire someone new. If you schedule those in advance, it's more likely that you will actually take the time to sit down and think about it, rather than being caught up in the day-to-day.

        At least that's what helps me :)

        1. 1

          This is a great point. Planned time for reflection on what’s needed and giving yourself dedicated time to address those things makes sense. I think you nailed it when you said it’s easy to fall into the trap of being faster. Fast is good but only to a certain point. I’d argue post MVP, once you’ve proven users will use your product, fast matters far less and planning needs to replace fast.

          1. 2

            As Professor Oak likes to say: "There's a time and place for everything!" :)

  18. 1

    This comment was deleted 3 years ago.
