I know that the question is 4 years old, but the accepted answer does not make sense (as Justin Raymond pointed out).
Nick Babcock's approach is imprecise, because the number of elements is too low; with so few elements, the constant heap overhead that you also measure distorts the result.
To show this, I used a larger data type and more elements (4096): on g++ 6.2.1 and Linux x64, sizeof(void*) = 8 and sizeof(bigData_t) = 800 (bigData_t is long[100]).
So what do we expect? Each kind of list must store the actual data on the heap; in addition, each std::list node stores 2 pointers (previous and next), while each std::forward_list node stores only one (next).
Expected memory for std::list : 4096 x 800 + 2 x 8 x 4096 = 3,342,336 bytes
Actual memory for std::list : 3,415,040 bytes
Expected memory for std::forward_list : 4096 x 800 + 1 x 8 x 4096 = 3,309,568 bytes
Actual memory for std::forward_list : 3,382,272 bytes
The actual numbers were measured with Valgrind's Massif.
As we can see, the numbers fit pretty well. When using large data types, the memory for the extra pointer does not matter much!
When using char as the data type (like the OP), the expected and actual amounts of memory do not match as well, most likely due to per-allocation overhead. Still, there is no factor of 3 in memory consumption.
std::list: Expected 69,632 bytes, actual: 171,008 bytes
std::forward_list: Expected 36,864 bytes, actual: 138,240 bytes
My code is:
    #include <list>
    #include <forward_list>

    struct bigData_t {
        long foo[100];
    };

    typedef bigData_t myType_t;
    // typedef char myType_t;

    int main() {
    #ifdef USE_FORWARD_LIST
        std::forward_list<myType_t> linkedList;
    #else
        std::list<myType_t> linkedList;
    #endif
        for (int i = 0; i < 4096; i++) {
            myType_t bigData;
            linkedList.push_front(bigData);
        }
        return 0;
    }