Auto-updated Test Environments with Large Databases

Zoran Antolovic (16.Mar.2017 at 18:00, 45 min)
Talk at ZgPHP meetup 2017/03 (#67) (English - US)

Rating: 5 of 5

If you have ever had to maintain and set up test environments for your production app, you know how much of a pain in the neck it can be. Manual DB imports are heavy, and automatic ones can easily be prone to security issues. In this talk I will describe the approach we've been using to keep large databases auto-updated on dynamic test environments for testing your web apps.

A test database that is decoupled from the production one, and is actually a clone of it, can easily get outdated, and further testing on it can make your development process harder. With the simple approach we've designed, we have fresh databases imported every night and a UI for choosing which database our app uses for testing purposes. I will explain how we used different tools to split, process, purge, seed, and prepare large DB dumps for import through the night, completely automatically, so we don't have to worry about them at all.
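As a rough illustration of the purge step in such a pipeline (not the actual tooling from the talk; the table names and the mysqldump comment markers it matches are assumptions), a dump can be streamed line by line and every statement block for an unwanted table dropped before import:

```python
import re

# Assumed table names for tables we don't want in the test copy.
PURGE_TABLES = {"audit_log", "sessions"}

def purge_dump(lines, purge_tables):
    """Yield dump lines, skipping everything between a purged table's
    'Table structure' header and the next table's header."""
    skip = False
    for line in lines:
        m = re.match(r"-- Table structure for table `(\w+)`", line)
        if m:
            # A new table section begins: decide whether to skip it.
            skip = m.group(1) in purge_tables
        if not skip:
            yield line

# Tiny illustrative dump fragment.
dump = [
    "-- Table structure for table `users`",
    "INSERT INTO `users` VALUES (1,'ana');",
    "-- Table structure for table `audit_log`",
    "INSERT INTO `audit_log` VALUES (1,'login');",
]
print(list(purge_dump(dump, PURGE_TABLES)))
```

Streaming rather than loading the whole file matters here: a multi-GB dump never fits comfortably in memory, while a generator keeps only one line at a time.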



Rating: 5 of 5

18.Mar.2017 at 16:59 by Lovro (6 comments) via Web2 LIVE

Great insight into the problem domain and one of the possible solutions. I liked the step-by-step approach the lecturer took, building the solution in a few iterations. That turned out to be very useful for a general understanding of this talk.

Outstanding, honest delivery and good discussion afterwards made this talk A+.

Rating: 5 of 5

20.Mar.2017 at 21:22 by Neven M (2 comments) via Web2 LIVE

This talk demonstrated a great approach to solving a DB-related problem. The QA session after it offered some valuable alternative solutions to the same problem. The lecturer presented the problem and his custom solution in a clear fashion.

Rating: 5 of 5

22.Mar.2017 at 21:04 by Tomo Omot (24 comments) via Web2 LIVE

Zoran gave a great talk, outlining the process he used to solve the problem he was facing: manipulating large data sets and using them to seed the test environment for the service his team was building.

Clearly describing the background and giving the audience context, the presenter set the scene for a more detailed walk-through of the processes he and his team needed to implement to overcome obstacles that were no laughing matter: manipulating DB dumps several GBs in size originating from databases they had no control over, processing them to preserve and protect sensitive private data, and dramatically optimizing the import of dump files into the testing environment, cutting the time required from over 24 hours down to just a few.
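The "protect sensitive private data" step the reviewer mentions is typically an anonymization pass over the dump. As a minimal sketch (the column contents and replacement scheme are illustrative, not what the talk actually used), email addresses in INSERT statements can be rewritten to synthetic ones:

```python
import itertools
import re

# Crude email matcher; good enough for dump values in this sketch.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def anonymize_line(line):
    """Replace every email address on a dump line with a numbered
    placeholder, so no real user data reaches the test environment."""
    ids = itertools.count(1)
    return EMAIL_RE.sub(lambda m: f"user{next(ids)}@example.com", line)

print(anonymize_line("INSERT INTO `users` VALUES (1,'ana@real-mail.com');"))
# INSERT INTO `users` VALUES (1,'user1@example.com');
```

Because replacements are deterministic per line, the same pass can run every night without introducing drift between imports.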

The talk inspired a great discussion afterwards, with participants offering their own views on the matter and sharing invaluable advice from their own similar experiences on how best to solve problems at the scale of the one Zoran presented.
