CollEntRes C++ library for high-performance websites

FAQ

Doesn't recompilation of projects with a large number of files take too much time?

The structure of the CollEntRes C++ Framework minimizes the need to recompile a large number of files when a developer modifies a few files in a large project.

Let's assume we have a project that uses the CollEntRes C++ Framework and contains approx. 1000 CPP files with controller actions and views (approx. 500 endpoints). You compile the project from scratch and it finishes. Then you modify a file that belongs to some page, e.g. a file in a subfolder of the "pages" folder in the project root (it doesn't matter whether it is a controller file or a view), and you build and run the executable. In this case, recompilation is very fast (a few seconds at most), because only the CPP files of that single page are recompiled.

Now suppose you modify a C++ header that is included in 80% of the project's CPP files (such a file may contain, e.g., the Req class or the UserEntity class that is needed on almost every page) and you build and run the executable. In this case, 80% of the project files are recompiled, which could take about 12 minutes on a computer with modest performance. However, the number of such widely included files is small, and their content can usually be written at the beginning of development.
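
For illustration, a widely included header of this kind might look like the following minimal sketch. The class name UserEntity comes from the text above; the file name and the fields are assumptions made up for this example and are not the real CollEntRes API.

    // UserEntity.hpp -- hypothetical shared header included by most pages.
    // Changing anything here forces most translation units to be recompiled,
    // so its content should be stabilized early in development.
    #pragma once
    #include <string>

    struct UserEntity {
        long long id = 0;      // assumed fields for illustration only
        std::string login;
        std::string email;
    };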

You can also use, e.g., the Zapcc C++ compiler, which performs recompilation incredibly fast: recompilation takes at most 2 minutes instead of 12 (about 6x faster). On a MacBook Pro 2012 with an HDD, Zapcc recompiles those 80% of project files in under a minute, which is acceptable given that such large recompilations occur very rarely.

C++ does not have a garbage collector. Will that be a problem on pages with a lot of data?

The CollEntRes C++ Framework provides a way of getting data (collections, entities) from several related DB tables using the "with" member function (see the source code of the Simple Website project).

E.g. you get an article collection that contains article entity objects, then you use the "with" member function to load the article data, image, creator, tags and other related data with a single function call, and then you use the article collection to build the page HTML markup. After that, you only need to call the single "deleteWithEntities" member function on that collection (or use the "delete" operator on an entity with related data) to deallocate all the memory that was allocated while getting the collection and its related data. The source code of Simple Website shows how easily entities and collections are obtained and deleted, so this is not a big deal.
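
As a rough, self-contained sketch of this ownership pattern: the classes below are illustrative stand-ins, not the actual CollEntRes API. Only the name deleteWithEntities is taken from the text above, and the related data is populated directly instead of being loaded via "with".

    #include <iostream>
    #include <string>
    #include <vector>

    // Illustrative stand-ins, not the real CollEntRes classes.
    struct Tag   { std::string name; };
    struct Image { std::string url; };

    struct ArticleEntity {
        std::string title;
        Image* image = nullptr;              // related data (would be loaded via "with")
        std::vector<Tag*> tags;
        ~ArticleEntity() {                   // an entity frees the related data it owns
            delete image;
            for (Tag* t : tags) delete t;
        }
    };

    struct ArticleCollection {
        std::vector<ArticleEntity*> items;
        // Single cleanup call that frees every entity together with its related data,
        // in the spirit of the deleteWithEntities() described above.
        void deleteWithEntities() {
            for (ArticleEntity* a : items) delete a;
            items.clear();
        }
    };

    int main() {
        ArticleCollection articles;
        auto* a = new ArticleEntity{"Hello"};
        a->image = new Image{"/img/hello.png"};
        a->tags.push_back(new Tag{"c++"});
        articles.items.push_back(a);

        // ... build the page HTML markup from the collection here ...
        std::cout << articles.items.size() << " article(s) rendered\n";

        articles.deleteWithEntities();       // one call deallocates everything above
        return 0;
    }

The point of the pattern is that all cleanup for a page happens in one call after the HTML is built, so manual memory management stays manageable even on pages with a lot of data.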

Of course, a developer can forget to call the "deleteWithEntities" member function on some collection or to use the "delete" operator on some entity somewhere in the code. This leads to a memory leak that can eventually crash the web application. But you can detect the places where memory was not deallocated properly using memory-checking tools at the end of project development. If you are using Linux, Valgrind makes this very easy.
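
For example, here is a minimal, deliberately leaky program and a typical Valgrind invocation; the file and binary names are made up for this sketch.

    #include <string>

    struct LeakyEntity { std::string title; };  // illustrative stand-in

    int main() {
        auto* e = new LeakyEntity{"oops"};      // never deleted: a leak
        (void)e;
        return 0;
    }

    // Build and check, e.g.:
    //   g++ -g leaky.cpp -o leaky_app
    //   valgrind --leak-check=full ./leaky_app
    // Valgrind reports the lost block together with its allocation stack,
    // which points back to the line with "new" above.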

What is the recommended operating system for development?

You can use any operating system that is supported by the CollEntRes C++ library, but I would recommend Ubuntu or another x64 Linux OS. Tools such as the Zapcc C++ compiler and Valgrind are available to Linux users and are very useful for C++ development.