Forest fire prediction and analytics app

Development and maintenance of applications based on distributed services; migration of parts of the platform to the AWS cloud; preparation for migrating payments to Stripe.

Google Cloud Run
Google Cloud Functions

To predict the effects of forest fires in the US and Canada.


Determining the scale of the risk of a natural disaster and the need to take preventive actions.


Using artificial intelligence and a computing platform to create reports based on the analysis of state documents.


A non-profit data collection and analytics application that aims to predict the effects of forest fires in the US and Canada. The application generates advanced reports, using artificial intelligence to analyze extensive state documents, and a computing platform with the PostGIS extension to interpret geographic data and save it directly to the database. Based on the generated reports, the scale of risk posed by a natural disaster is determined and preventive measures are then taken.
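The PostGIS integration mentioned above could look roughly like the sketch below: building a parameterized query that stores a GeoJSON geometry in a geometry column. `ST_GeomFromGeoJSON` is a standard PostGIS function, but the table name, column names, and sample area are hypothetical:

```javascript
// Minimal sketch: saving a GeoJSON geometry into a PostGIS column.
// Table/column names are hypothetical; ST_GeomFromGeoJSON is standard PostGIS.
function buildInsertQuery(name, geojson) {
  return {
    text: `INSERT INTO disaster_areas (name, boundary)
           VALUES ($1, ST_GeomFromGeoJSON($2))`,
    values: [name, JSON.stringify(geojson)],
  };
}

// Example: a (hypothetical) burned-area polygon near the US/Canada border
const query = buildInsertQuery('Sample fire zone', {
  type: 'Polygon',
  coordinates: [[[-119.5, 49.9], [-119.4, 49.9], [-119.4, 50.0], [-119.5, 49.9]]],
});
```

A parameterized query like this would typically be passed to a client such as `pg`, which keeps the geometry as structured data rather than string-concatenated SQL.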

  • Scope

    Frontend & Backend Development, Architecture, Cloud Infrastructure, Databases, QA

  • Industry

Environmental Protection and Crisis Management, Information Technology (IT), Data Analysis, Research and Development (R&D)

  • Region / Country:

USA / Canada


Every year, as a result of spontaneous combustion or irresponsible human action, hectares of forest on the border of Canada and the United States burn. The losses caused by these natural disasters are documented by state authorities and stored in statistical data sets, as well as on specially created maps of the area. Thanks to complex calculations, it is possible to predict the effects of future threats, e.g., to estimate the number of species that may die in a given area in a possible disaster.
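As a purely hypothetical illustration of such an estimate (not the project's actual model), a first-order figure for individuals at risk can be obtained by multiplying the burned area by an average species density:

```javascript
// Hypothetical illustration only: individuals at risk estimated as
// burned area (hectares) times average density (individuals per hectare).
function estimateAtRisk(burnedAreaHa, densityPerHa) {
  return Math.round(burnedAreaHa * densityPerHa);
}

// e.g. 1,200 ha burned at an assumed average of 2.5 individuals per hectare
const atRisk = estimateAtRisk(1200, 2.5); // 3000
```

Real models layer in terrain, species distribution maps, and historical disaster data, which is what the statistical data sets and area maps above feed into.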


Our development team joined the application's core creators, taking on two main tasks.

The first was generating advanced reports. Based on current and historical statistical data and terrain maps, detailed data was extracted for each disaster case.

To make this possible and quick to implement, the monolithic architecture was broken down into a microservice pattern built from functions. For this purpose, PostgreSQL was combined with Node.js, which acted as an "observer", and the system was moved to Google Cloud using a serverless approach.

The second was the development of an application for collecting data from forms and generating the final reports as PDFs. To ensure the correctness of the data entered in the forms, the team created a mechanism to check and correct the area boundaries submitted to the project. The maps were represented as GeoJSON, so the calculations were quick and error-free. In addition, a mechanism was built to generate e-mails for tasks assigned to an official, with reminders when a task has remained unfinished for a long time.
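A boundary check of the kind described could be sketched as below, assuming boundaries arrive as GeoJSON polygons; the function name and the exact rules are illustrative, not the project's actual validator:

```javascript
// Sketch of a boundary-correctness check for form input, assuming the map
// data is a GeoJSON Polygon (function name and rules are hypothetical).
function isValidBoundary(polygon) {
  if (!polygon || polygon.type !== 'Polygon') return false;
  const ring = polygon.coordinates && polygon.coordinates[0];
  if (!ring || ring.length < 4) return false;            // a ring needs >= 4 points
  const first = ring[0];
  const last = ring[ring.length - 1];
  if (first[0] !== last[0] || first[1] !== last[1]) return false; // ring must close
  return ring.every(([lng, lat]) =>
    lng >= -180 && lng <= 180 && lat >= -90 && lat <= 90); // coordinates in range
}
```

Catching a malformed or unclosed boundary at form time keeps the downstream GeoJSON calculations from silently producing wrong areas.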

To determine whether the function that counts the sources of possible disasters produces correct results, the testing method had to be changed. Previously, tests only checked that the function ran over the maps; when errors occurred, the functional and integration tests did not stop CI, and the calculations always had to be checked manually. By introducing unit, functional and integration tests with assertions, the result is now compared against the expected value, both for individual functional elements and for the entire system.

Project results

Reducing state officials' working time for preparing reports from the available statistical data from half a year to half an hour, thanks to the development and application of dedicated algorithms.

A 40% reduction in costs and faster application performance, thanks to a serverless application that performs map calculations in the Google Cloud Functions environment.

Increased quality, detail and speed of testing, achieved by changing the testing method to unit, functional and integration tests with assertions. Costs on GitHub Actions were also reduced: after the change, it was no longer necessary to run four virtual machine units; one was sufficient.