I am trying to design a data model that will hold a very large volume of data. Does anyone with experience of large data sets have feedback on this design? Here is the model:
public class TransactionAccount {
    private long balance;
    private List<Transaction> transactions = new ArrayList<Transaction>();
    ....
    public long getBalance() { return balance; }

    private static class Transaction {
        public Date date;
        public long amount;
    }
}
Based on what I have read, the only way to get transactional integrity when inserting a Transaction and updating the balance is to put them in a single entity group.
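To make the invariant concrete, here is a minimal in-memory sketch (plain Java, not datastore code) of the "both writes succeed or fail together" behavior I need; on App Engine the two updates would instead go in one datastore transaction over the entity group:

```java
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class TransactionAccount {
    private long balance;
    private final List<Transaction> transactions = new ArrayList<Transaction>();

    public static class Transaction {
        public final Date date;
        public final long amount;
        public Transaction(Date date, long amount) {
            this.date = date;
            this.amount = amount;
        }
    }

    // The transaction record and the balance are updated together,
    // so a reader never sees one without the other.
    public synchronized void addTransaction(Transaction t) {
        transactions.add(t);
        balance += t.amount;
    }

    public synchronized long getBalance() { return balance; }
}
```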
However, over time there will be millions of transactions for a given TransactionAccount. The write rate on this entity group will be low, but the read rate will be much higher.
I know the balance could be sharded, but reading the balance is a very frequent operation, and sharding would turn getBalance(), one of the most common operations, into the slowest one.
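My worry with sharding is the read path. A sketch of a sharded counter (in-memory here; on App Engine each shard would be its own entity) shows why: writes touch one randomly chosen shard and so rarely contend, but every read of the balance has to visit all N shards and sum them:

```java
import java.util.concurrent.ThreadLocalRandom;

// In-memory sketch of a sharded counter. The shard count is an
// assumption for illustration; a real datastore counter would store
// each shard as a separate entity outside the account's entity group.
public class ShardedBalance {
    private final long[] shards;

    public ShardedBalance(int numShards) {
        this.shards = new long[numShards];
    }

    // Write: pick one shard at random, so concurrent writers
    // usually hit different shards and do not contend.
    public void add(long amount) {
        int i = ThreadLocalRandom.current().nextInt(shards.length);
        synchronized (this) {
            shards[i] += amount;
        }
    }

    // Read: must visit every shard and sum them -- this is the
    // extra cost that would make getBalance() slower.
    public synchronized long getBalance() {
        long total = 0;
        for (long s : shards) total += s;
        return total;
    }
}
```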