If you are dealing with a fairly simple pattern, where each letter depends only on the previous one, a Hidden Markov Model (HMM) will solve it; in fact, something as simple as a Markov chain will work.
If you want to have some fun, here's an HMM-based solution that you can play with.
Take your sample data and build a linked list of each item in the order it was seen. Then create a separate bucket for each distinct character, and put into each bucket the indices of the list nodes where that character occurs. Here's a (very roughly drawn) visual of the linked list with the buckets below it:

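The structure above can be sketched in a few lines of Python. This is just my own reading of it (the function name `build_index` is mine, and I'm standing in for the linked list with a plain Python list, since all we need is ordered traversal by index):

```python
from collections import defaultdict

def build_index(data):
    """Store the data as an ordered sequence, and bucket the
    positions at which each character occurs."""
    sequence = list(data)        # stands in for the linked list
    buckets = defaultdict(list)  # char -> indices where it appears
    for i, ch in enumerate(sequence):
        buckets[ch].append(i)
    return sequence, buckets

sequence, buckets = build_index("ABACBAC")
print(buckets["B"])  # positions where 'B' occurred: [1, 4]
```

Each bucket gives you instant access to every place a character showed up, which is what makes the lookup in the next step cheap.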
Now, when you are given a sequence and asked to predict the next character, you only need to look at the last X characters and see what followed similar subsequences in the past.
To use my example above, look at the last 3 characters, which gives you BAC. You want to see whether the sequence BAC has occurred before, and if so, what came after it. Check the bucket for the first letter of BAC (B): it tells you every place B appeared earlier. One of those earlier occurrences of B starts a matching BAC, and the character that followed it was A, so that would be your prediction.
You can check not only sequences of the last X characters, but also every length below X, giving shorter matches less weight, to build a better heuristic.
The hard part is deciding how far back to look: look at too many characters and matching takes too long (and you will rarely find an exact match); look at too few and you may miss the pattern entirely and end up guessing.
Good luck - I hope it's nice and easy to implement and works for you.
Addison