21 Comments

Are automated tests overkill for an MVP?

Hey everyone

The deal is as follows. I'm creating a very complex web application, the MVP for which is also quite complex. As I work through the remaining pre-launch tasks, I keep accidentally introducing regressions. Now, I know that it's fairly normal to introduce regressions when initially creating an app, but the issue for me is that there's so much functionality, plus so many different ways and combinations in which people could use the app, that bad regressions could slip in without me even realizing it.

The way I see it, there are 2 ways to skin this cat.

  1. Write down a manual test plan that goes through many test scenarios; before each release I can go through those scenarios and make sure the core functionality of the app is still working.

  2. Create an automated test suite which I can run much more often and with less time invested in each run.

My dilemma, however, is that creating an automated test suite takes a significant amount of time invested up front, whereas manual testing takes longer for each run but requires less time invested up front.
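For a sense of scale on that up-front investment, option 2 can start very small: a single automated smoke test over the most critical flow already catches the worst regressions. A minimal sketch in Python — `create_project` and `add_clip` are hypothetical stand-ins for whatever the app's core operations actually are:

```python
# Hypothetical stand-ins for an app's core operations; the real MVP
# logic would live here instead.
def create_project(name: str) -> dict:
    return {"name": name, "clips": []}

def add_clip(project: dict, clip_id: str) -> dict:
    project["clips"].append(clip_id)
    return project

# One smoke test over the happy path: it takes seconds to run, so it
# can run before every release (or on every commit).
def smoke_test() -> bool:
    project = add_clip(create_project("demo"), "clip-1")
    return project["clips"] == ["clip-1"]
```

Even a handful of checks like this turns "did I break something?" from an hour of clicking into a few seconds per run.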

What would you do?

  1. 3

    Tests are not a bad investment if done correctly. Not only do they automate the most hated part of coding, but they also help you think through and understand your design requirements.

    In my opinion, it's a biased take when people say tests take a lot of time to write and are expensive to modify when requirements change. I think people simply make far more changes to a codebase when they're confident that a good test suite will catch regressions. Without tests, you'll likely make significantly fewer changes to your MVP out of fear of breaking the app.

    An MVP demands agility and stability. Customers will not trust you if you keep breaking things in the name of the MVP. Good tests help you limit those failures.

    1. 1

      You bring up a good point regarding customer trust. I hadn't thought of it like that. I was more focused on getting my product out to the masses ASAP. Also, since the app is still in its infancy, there were some major rewrites that were required and since there weren't any tests it was quite painful.

  2. 2

    Unpopular opinion #2: Automated tests for MVPs are a complete waste of time.

    I would take a step back and consider what the M in MVP truly is. Recall that a spreadsheet, signup form, or landing page can serve as an MVP. You need customers to pay you - and if they pre-pay for the "Product", then you have achieved "Minimum Viable".

    Best of all, when you think about the minimum and then put it out there, potential customers will either ignore it (typical) or want it yesterday (great). Don't worry about fast copycats or dev shops. They won't put in the energy to steal your customers and won't truly understand why your product is needed. Ideas are worthless without execution.

    Whenever you need a complete rewrite, it typically means something wasn't what you expected - which of course leads to new automated tests ... so more wasted effort. If you build just the core, however, it shouldn't need many rewrites, and that would be perfect for simple sanity/smoke tests.

    And that of course would also be your MVP... So ship it already!
    </rant>

    1. 1

      Well, I guess the missing information is what I'm actually creating, which is a web-based video editor (but with a special twist, which is what makes it unique). Since there are other competitors out there, my thinking is the MVP needs to actually function as a video editor. Therefore the core functionality should remain pretty much the same, which, based on what you said, leads me to write automated tests.

      I totally agree that if this was a simpler project then automated tests would be a waste of time, but due to the complexity of the project, I think automated tests are the way to go at this point.

      1. 1

        yeah. no worries. best of luck.

        it's that old NEVER AGAIN, EVER, EVER!!! but maybe in this case :D

  3. 2

    Unpopular opinion: automated tests are almost always a waste of time. You might rewrite everything 3 times before you find product-market fit. Write some tests if your code quality is slipping as you scale or you're struggling to onboard new team members, but it's probably not worth doing it before that.

  4. 1

    There is a concept called the Test Pyramid. At the bottom are unit tests (these should be the majority of your coverage), then integration tests (between components, services, or client and server), and then E2E tests (full system tests). All of these tests are automated (they run on command and are programmable). Start from the bottom up, and start with only the happy path. Use better coding practices and patterns so you don't need to test implementations; this reduces the amount of testing required to the minimum. Also, only test what is critical to your system, or what you believe is most likely to fail. Then just add more tests as more users (or you) find issues. Last words: testing is expensive, but not having it at all is more expensive.
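The bottom of that pyramid — a happy-path-only unit test over one critical piece of logic — could look like this sketch in Python (`split_clip` is a hypothetical example, not from the actual product):

```python
# Hypothetical core function worth covering first: fast to test,
# no browser, no network -- the wide base of the test pyramid.
def split_clip(duration: float, at: float) -> tuple:
    """Split a clip of `duration` seconds at offset `at` seconds."""
    if not 0 < at < duration:
        raise ValueError("split point must fall inside the clip")
    return (at, duration - at)

# Happy path only, as suggested above: one representative input,
# no edge-case matrix until real usage shows where the risk is.
assert split_clip(10.0, 4.0) == (4.0, 6.0)
```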

  5. 1

    I should have mentioned, maybe you should consider the tools... qawolf.com for example.

    What are you using now? postman.com for API testing at least?

    1. 1

      My thinking was to use Playwright for the testing. My APIs are quite simple at this stage and don't really require testing.

  6. 1

    I personally dislike testing in general, mostly because even with testing you can still have bugs/problems in production, and you usually test the use cases you predict, not the ones that will actually happen (especially for an MVP).

    I prefer to think more while writing code and to write code in such a way that things don't break, or if they do break, the source of the problem is pretty clear.

    Another thing about an MVP is that you might change core features or restructure the code/platform a lot, so tests will be a waste of time. Not only that, but an MVP can still provide value even if it's buggy, so tests don't add a lot of value. MVPs are also manually tested pretty often, so you will find bugs anyway.

    1. 1

      I guess it boils down to preference. The answers here seem to be split down the middle on whether or not tests should be created for an MVP.

  7. 1

    I think there is a good argument for spending the time upfront to automate tests and then recover the time and thank yourself later.

    At a minimum, you could just automate tests for areas that you have identified as high risk and then fill in the gaps as you go. Not perfect, but it can be a good compromise.

  8. 1

    We found integration tests to have fairly immediate ROI and since they are based on base APIs they tend to be resilient to product churn. We were not able to make UI level automated tests work cheaply enough.

    1. 1

      I'd like to say that because the APIs are pretty resilient and don't change as much, they don't really need tests. What I need however is automated UI tests since that's the core functionality of the app.

      1. 2

        Most of the complexity of a video editor can be wrapped in APIs - and maybe those APIs start at the Javascript level. If you have integration testing of all of that complexity then you are just left with pure UI on top. That UI layer is the hardest to automate and the least likely to stand still long enough for testing. You might be able to get ROI at that level in a very mature product or if testing automation AI advances but for a newer product and our current level of technology we tried and failed.

        1. 1

          Well, I guess it depends on what type of UI tests. I agree with you if we're talking about checking that the UI is in the correct place, is the correct color, or that a specific input exists in a specific location. The type of tests I'm thinking about are functional UI tests: does adding different asset types (audio/video/images) to the project still work? Does splitting media assets still work? Those types of functional behaviors are, in my opinion, more resilient and less likely to change.

          1. 1

            No, if you only test the functional side and ignore UI glitches, then someone still has to go look at everything anyway. That's the same as testing at the API level, but much harder to implement, since you're actually pressing buttons, opening drop-downs, etc.

            We tried exactly what you are saying and it didn't work.

            1. 1

              Maybe in some circumstances, but for my project at the moment, I'm not seeing any visual regressions, just functional ones.

              1. 1

                Again, if it's just functional, it's cheaper to test at the API level rather than come in from the UI.

                1. 1

                  Okay, I think I understand you now. When you mentioned APIs, my brain automatically went to the server APIs, which isn't what you meant. However, I still think that the underlying functional APIs could change (performance optimizations, refactoring, etc.), but the end visual result should still be the same.
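That trade-off — asserting on the end result while letting the implementation change underneath — is what API-level functional tests buy. A minimal sketch, assuming a hypothetical in-app timeline model (none of these names come from the actual product):

```python
# Hypothetical in-app timeline model. The check below pins down the
# observable outcome of a split, not how the split is implemented,
# so refactors and performance optimizations stay free to change
# the internals.
class Timeline:
    def __init__(self) -> None:
        self.assets: list[tuple[str, float]] = []  # (asset_id, duration)

    def add_asset(self, asset_id: str, duration: float) -> None:
        self.assets.append((asset_id, duration))

    def split_asset(self, index: int, at: float) -> None:
        asset_id, duration = self.assets[index]
        # Internals may be rewritten; the resulting asset list may not.
        self.assets[index:index + 1] = [
            (asset_id + "-a", at),
            (asset_id + "-b", duration - at),
        ]

# Functional check at the API level: the same "end visual result"
# the UI would show, without driving buttons and drop-downs.
tl = Timeline()
tl.add_asset("clip1", 10.0)
tl.split_asset(0, 4.0)
assert [d for _, d in tl.assets] == [4.0, 6.0]
```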

  9. 2

    This comment was deleted 3 years ago.

    1. 1

      I completely agree with that sentiment, I guess the difficult part for me is that creating the actual testing framework won't be as simple for me due to the requirements of the app.
