./mach test-perf can be used to run a performance test on your Servo build. The test result JSON will be saved to etc/ci/performance/output/. You can then run python test_differ.py to compare two test results (for example, a run from before and after a change). Run python test_differ.py -h for instructions.
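If you prefer to post-process the output JSON yourself instead of using test_differ.py, a minimal sketch along these lines may help. The field names (testcase, domComplete, navigationStart) are assumptions about the output format; check the actual files and adjust as needed.

```python
# compare_runs.py -- hedged sketch, not test_differ.py itself. Assumes each
# output file is a JSON list of results with "testcase", "domComplete" and
# "navigationStart" fields; adjust to match the real format.
import json
import sys

def load_times(path):
    """Map each testcase URL to its page load time in milliseconds."""
    with open(path) as f:
        results = json.load(f)
    return {r["testcase"]: r["domComplete"] - r["navigationStart"] for r in results}

def main(before_path, after_path):
    before = load_times(before_path)
    after = load_times(after_path)
    for testcase in sorted(before.keys() & after.keys()):
        delta = after[testcase] - before[testcase]
        print("{:+10.1f} ms  {}".format(delta, testcase))

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

Usage would look like python compare_runs.py output/perf-<before>.json output/perf-<after>.json.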
Submitting results to Perfherder:

- Set the TREEHERDER_CLIENT_ID and TREEHERDER_CLIENT_SECRET environment variables.
- Run ./mach test-perf --submit to run the test and submit the result to Perfherder.

Comparing with Gecko:

- Download geckodriver and add it to your PATH (e.g. for Linux: export PATH=$PATH:path/to/geckodriver).
- Point FIREFOX_BIN at your Firefox binary: export FIREFOX_BIN=/path/to/firefox
- pip install selenium
- Run python gecko_driver.py to test the setup (a minimal Selenium sketch of this kind of check follows this list).
- Run test_all.sh --gecko --submit (omit --submit if you don't want to submit to Perfherder).
- Run python test_differ.py output/perf-<before time>.json output/perf-<after time>.json to compare the results.
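Before running the full suite, it can be useful to check the Selenium/geckodriver setup by hand. The following is a hedged sketch of that kind of check, not the actual gecko_driver.py code; it assumes geckodriver is on PATH and FIREFOX_BIN is set as above, and uses the standard Navigation Timing API.

```python
# selenium_check.py -- a hedged sketch of a manual Selenium check, not the
# actual gecko_driver.py code. Assumes geckodriver is on PATH and FIREFOX_BIN
# points at a Firefox binary, as described above.
import os
from selenium import webdriver

options = webdriver.FirefoxOptions()
options.binary_location = os.environ["FIREFOX_BIN"]

driver = webdriver.Firefox(options=options)
try:
    driver.get("https://servo.org/")  # any page works for a smoke test
    # Navigation Timing API: rough page load time in milliseconds
    load_time = driver.execute_script(
        "return performance.timing.loadEventEnd - performance.timing.navigationStart;"
    )
    print("page load time: {} ms".format(load_time))
finally:
    driver.quit()
```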
Adding a new test case:

- Add the test page under the page_load_test/ folder. For example, we can create page_load_test/example/example.html.
- Add the page to a manifest file such as page_load_test/example.manifest (see the sketch after this list for one way the flags can be interpreted):

    # Pages get served on a local server at localhost:8000
    # A test case without any flag is a sync test
    http://localhost:8000/page_load_test/example/example_sync.html
    # An async test must start with an `async` flag
    async http://localhost:8000/page_load_test/example/example.html

  See page_load_test/example/example_async.html for an example of an async test page.
- Change the MANIFEST=... line in test_all.sh to point it to the new manifest file.
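For reference, the manifest format above (lines starting with # are comments, and an optional async flag precedes the URL) could be read roughly like this. This is only an illustrative sketch of the format; the harness has its own parser.

```python
# parse_manifest.py -- hedged sketch of how the manifest format above might be
# read; the real parsing lives in the test harness itself.
def parse_manifest(path):
    """Yield (url, is_async) pairs from a page load test manifest."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            if line.startswith("async "):
                yield line[len("async "):].strip(), True
            else:
                yield line, False

if __name__ == "__main__":
    for url, is_async in parse_manifest("page_load_test/example.manifest"):
        print("{} test: {}".format("async" if is_async else "sync", url))
```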
You can run all unit tests of the test harness (including 3rd-party libraries) with python -m pytest. Individual tests can be run with python -m pytest <filename>, for example:
- test_runner.py
- test_submit_to_perfherder.py

Running the same performance test repeatedly results in a lot of variance, caused by the OS the test is running on. Experimentally, the things which seem to tame the randomness the most are a) disabling CPU frequency changes, b) increasing the priority of the tests, c) running on one CPU core, d) loading files directly rather than via localhost HTTP, and e) serving files from memory rather than from disk.
First run the performance tests normally (this downloads the test suite):
./mach test-perf
Disable CPU frequency changes, e.g. on Linux:
sudo cpupower frequency-set --min 3.5GHz --max 3.5GHz
Copy the test files to a tmpfs directory such as /run/user/; for example, if your uid is 1000:
cp -r etc/ci/performance /run/user/1000
Then run the test suite on one core, at high priority, using a file:// base URL:
sudo nice --20 chrt -r 99 sudo -u *userid* taskset 1 ./mach test-perf --base file:///run/user/1000/performance/
These fixes seem to bring the variance down to under 5% for most individual tests, and under 0.5% in total.
(IRC logs: 2017-11-09 | 2017-11-10)
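If you want to quantify the remaining variance yourself, one option is to run the suite several times and compute a per-test coefficient of variation over the saved JSON files. The sketch below assumes the result entries carry testcase, domComplete and navigationStart fields, as in the comparison sketch above; adjust the field names to match the actual output.

```python
# measure_variance.py -- hedged sketch; assumes each output JSON is a list of
# results with "testcase", "domComplete" and "navigationStart" fields.
import glob
import json
import statistics
from collections import defaultdict

times = defaultdict(list)  # testcase -> list of load times across runs
for path in glob.glob("output/perf-*.json"):
    with open(path) as f:
        for result in json.load(f):
            times[result["testcase"]].append(
                result["domComplete"] - result["navigationStart"]
            )

for testcase, samples in sorted(times.items()):
    if len(samples) < 2:
        continue
    mean = statistics.mean(samples)
    cv = statistics.stdev(samples) / mean * 100 if mean else 0.0
    print("{:6.2f}% variation  {}".format(cv, testcase))
```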
If you want to test the data submission code in submit_to_perfherder.py without getting credentials for the production server, you can set up a local Treeherder VM. If you don't need to test submit_to_perfherder.py, you can skip this step.
- Add `192.168.33.10 local.treeherder.mozilla.org` to /etc/hosts.
- git clone https://github.com/mozilla/treeherder; cd treeherder
- vagrant up
- vagrant ssh
- ./bin/run_gunicorn
- Visit http://local.treeherder.mozilla.org and log in to create an account.
- vagrant ssh
- ./manage.py create_credentials <username> <email> "description"; the email has to match your logged-in user. Remember to log in through the Web UI once before you run this.
- Use the generated credentials as TREEHERDER_CLIENT_ID and TREEHERDER_CLIENT_SECRET.
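Once the credentials exist, you can sanity-check that the local VM is reachable before pointing submit_to_perfherder.py at it. The snippet below is a hedged sketch using the requests library; the /api/repository/ endpoint is an assumption about the Treeherder REST API.

```python
# local_vm_check.py -- hedged sketch: a quick reachability check for the local
# Treeherder VM. The /api/repository/ endpoint is an assumption about the
# Treeherder REST API.
import os
import requests

SERVER = "http://local.treeherder.mozilla.org"

# Export TREEHERDER_CLIENT_ID and TREEHERDER_CLIENT_SECRET (created with
# ./manage.py create_credentials) before running submit_to_perfherder.py.
client_id = os.environ["TREEHERDER_CLIENT_ID"]

resp = requests.get(SERVER + "/api/repository/", timeout=10)
resp.raise_for_status()
print("local treeherder is up, {} repositories visible".format(len(resp.json())))
print("submission will use client id: {}".format(client_id))
```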