- For setting up Jasmine I've used the Jasmine generator for Yeoman.
- For setting up Karma I've used the Karma generator for Yeoman.
- For Mocha there is also a Yeoman generator, but it is not used here; to start writing Mocha tests it is enough to clone the related directories in this repo.
- Copy the whole directory unit-tests/karma-jasmine_in_chrome.
- Get into the directory and execute `npm install`.
- Load index.html in the browser and verify that all but one of the sample tests run successfully.
- Import your code and start writing your own tests.
- Copy the whole directory unit-tests/requirejs_es6_in-browser.
- Get into the directory and execute `npm install`.
- Execute `npm run compile_es6`; this starts a sample watcher that recompiles every time you modify one of the relevant files.
- You can check in package.json the syntax of the watcher, the directories it watches, etc.
- Load Mocha_test_page.html in the browser and verify that the sample test runs successfully.
- Import your code and start writing your own tests. NB: every time you create or copy a new file into the setup, you need to restart the watcher!
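For orientation, the watcher entry in package.json might look roughly like this; the Babel command, flags, and directory names below are illustrative assumptions, not the repo's actual script:

```json
{
  "scripts": {
    "compile_es6": "babel src --out-dir compiled --watch"
  }
}
```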
- Copy the whole directory unit_tests/es6_in-server.
- Get into the directory and execute `npm install`.
- Execute `npm run run_es6_tests` and verify that the sample test runs successfully.
- You can check in package.json the syntax of the compile command, the directories it reads from and writes to, etc.
- Import your code and start writing your own tests.
- Run the unit-test example with coverage: `npm install; npm test`
- The Istanbul code coverage report will be generated in the unit-tests/karma-jasmine_in_chrome/coverage folder.
- An example test case implemented with NightwatchJS can be found in the folder e2e-tests/nightwatch
- An example test case implemented with TestCafe can be found in the folder e2e-tests/testcafe
- An example test case implemented with Puppeteer can be found in the folder e2e-tests/puppeteer
- An example test case implemented with CodeceptJS can be found in the folder e2e-tests/codecept
- Start the server to have a basic page to run the tests against: `npm install; npm start`
- Start the NightwatchJS end-to-end tests: `cd e2e-tests/nightwatch; npm install; npm test`
- Start the TestCafe end-to-end tests: `cd e2e-tests/testcafe; npm install; npm test`
- Start the TestCafe end-to-end tests on a mobile device: `cd e2e-tests/testcafe; npm install; npm run test-mobile`
- Start the Puppeteer end-to-end tests: `cd e2e-tests/puppeteer; npm install; npm test`
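For reference, a minimal Puppeteer script in the spirit of the example might look like this sketch; the URL/port and the `RUN_E2E` guard are assumptions added here so the helper can be checked without launching a browser:

```javascript
// Hypothetical title check used by the e2e run below; kept as a pure
// function so it can be verified without a browser.
function titleLooksValid(title) {
  return typeof title === 'string' && title.trim().length > 0;
}

// Sketch of a Puppeteer run against the local server started with `npm start`.
// The address below is an assumption; adjust it to the actual server config.
async function run() {
  const puppeteer = require('puppeteer'); // loaded lazily so the helper above stays browser-free
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('http://localhost:8080'); // assumed address of the sample page
  const title = await page.title();
  console.log(`Page title: ${title}, valid: ${titleLooksValid(title)}`);
  await browser.close();
}

// Only launch a real browser when explicitly asked for.
if (process.env.RUN_E2E) {
  run().catch((err) => {
    console.error(err);
    process.exit(1);
  });
}
```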
- Start the CodeceptJS end-to-end tests: `cd e2e-tests/codecept; npm install; npm run install:drivers; npm run start:selenium; npm test`
- To run the tests on specific or remote browsers, have a look at package.json.
- With visual validation you create a baseline set of images of your application or of specific components/elements and store it; during testing you generate another set. The new set is then compared with the baseline, producing an image of the differences. This kind of testing is useful for spotting CSS issues, since the other types of testing focus more on functionality.
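The comparison step can be sketched in plain JavaScript over flat grayscale pixel arrays (a toy stand-in for what a real tool like Resemble does on PNG data):

```javascript
// Toy pixel diff: compares two equally sized grayscale "images"
// (flat arrays of 0-255 values) and returns the mismatch percentage
// plus a diff image marking changed pixels. A concept sketch only;
// real tooling also handles colour channels and anti-aliasing.
function diffImages(baseline, current, tolerance = 0) {
  if (baseline.length !== current.length) {
    throw new Error('Images must have the same dimensions');
  }
  const diff = new Array(baseline.length);
  let mismatched = 0;
  for (let i = 0; i < baseline.length; i++) {
    const changed = Math.abs(baseline[i] - current[i]) > tolerance;
    if (changed) mismatched++;
    diff[i] = changed ? 255 : 0; // white pixel marks a difference
  }
  return {
    diff,
    misMatchPercentage: (mismatched / baseline.length) * 100,
  };
}
```

For example, comparing `[0, 0, 0, 0]` against `[0, 255, 0, 0]` reports one changed pixel out of four, i.e. a 25% mismatch.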
- In the example implemented alongside the Puppeteer example, I use Resemble to compute the visual difference between the generated images. After running the test, the diff image can be found in the e2e-tests/puppeteer/output folder.
- In some of the test examples I've used Faker.js, a random test-data generator.
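The idea behind such a generator can be sketched with a tiny one of our own (a toy stand-in, not the Faker API): pick random parts and combine them into plausible-looking records.

```javascript
// Toy random test-data generator illustrating the Faker idea:
// pick random parts and combine them into plausible-looking records.
const FIRST_NAMES = ['Ada', 'Grace', 'Alan', 'Edsger'];
const LAST_NAMES = ['Lovelace', 'Hopper', 'Turing', 'Dijkstra'];

function pick(list) {
  return list[Math.floor(Math.random() * list.length)];
}

function fakeUser() {
  const first = pick(FIRST_NAMES);
  const last = pick(LAST_NAMES);
  return {
    name: `${first} ${last}`,
    email: `${first}.${last}@example.com`.toLowerCase(),
  };
}
```

Each call to `fakeUser()` returns a fresh record, so tests don't depend on hard-coded fixture data.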