20. How to
20.1. Run security checks for dependencies
You can run the crontabber job that checks for security vulnerabilities locally:
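A sketch, assuming the job is registered as DependencySecurityCheckCronApp and the docker-compose service is named crontabber (verify both against socorro/cron/crontabber_app.py and your docker-compose.yml); it uses the same --job/--force invocation as the one-off jobs later in this chapter:
$ docker-compose run crontabber python socorro/cron/crontabber_app.py --job=DependencySecurityCheckCronApp --force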
20.2. Connect to PostgreSQL Database
The local development environment’s PostgreSQL database exposes itself on a non-standard port when run with docker-compose. You can connect to it with the client of your choice, using the host, port, username, and password defined in your docker-compose configuration.
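For example, assuming the database is named breakpad (as in the setup section below), and substituting the port and username from your docker-compose configuration, you can connect with psql:
$ psql -h localhost -p <port> -U <username> breakpad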
20.3. Reprocess crashes
20.3.1. Reprocessing individual crashes
If you have appropriate permissions, you can reprocess an individual crash by viewing the crash report on the Crash Stats site, opening the “Reprocess” tab, and clicking the “Reprocess this crash” button.
20.3.2. Reprocessing lots of crashes if you’re not an admin
If you need to reprocess a lot of crashes, please write up a bug. In the bug description, include a Super Search URL with the crashes you want reprocessed.
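For example, a Super Search URL looks something like https://crash-stats.mozilla.org/search/?signature=%3Dsome%20%7C%20signature (the signature here is hypothetical; the simplest approach is to run the search on Crash Stats and copy the URL from the address bar).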
20.3.3. Reprocessing crashes if you’re an admin
If you’re an admin, you can create an API token with the “Reprocess Crashes”
permission. You can use this token in conjunction with the
scripts/reprocess.py script to set crashes up for reprocessing.
For example, this reprocesses a single crash:
$ docker-compose run processor bash
app@processor:app$ ./scripts/reprocess.py c2815fd1-e87b-45e9-9630-765060180110
This reprocesses all crashes with a specified signature:
$ docker-compose run processor bash
app@processor:app$ ./scripts/fetch_crashids.py --signature="some | signature" | ./scripts/reprocess.py
If you’re reprocessing more than 10,000 crashes, make sure to add a sleep argument of 10 seconds (--sleep 10). This slows down adding items to the reprocessing queue so that the rate at which crashes are added roughly matches the rate at which they’re processed. Otherwise, you’ll exceed our alert triggers for queue sizes and it’ll page people.
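With throttling added, the signature example above becomes:
$ docker-compose run processor bash
app@processor:app$ ./scripts/fetch_crashids.py --signature="some | signature" | ./scripts/reprocess.py --sleep 10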
August 17th, 2017: Everything below this point is outdated.
20.4. Populate PostgreSQL Database
Load Socorro schema plus test products:
socorro setupdb --database_name=breakpad --createdb
20.5. Create partitioned tables
Normally this is handled automatically by the cronjob scheduler (see Service: Crontabber), but it can be run as a one-off:
python socorro/cron/crontabber_app.py --job=weekly-reports-partitions --force
20.6. Populate Elasticsearch database
See the chapter about Crash storage: Elasticsearch for more information.
Once you have populated your PostgreSQL database with “fake data”, you can migrate that data into Elasticsearch:
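A sketch, assuming the migration script of that era lived at socorro/external/postgresql/crash_migration_app.py (the path is an assumption; verify it against a checkout from that period):
python socorro/external/postgresql/crash_migration_app.py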
20.7. Sync Django database
Django needs to write its ORM tables:
export SECRET_KEY="..."
cd webapp-django
./manage.py migrate auth
./manage.py migrate