AA4PB has hit on an important point:
. . . Choosing _how to represent Morse code_ in the memory is an important design decision.
That's very different from PA0BLAH's assumption that the dits and dahs were stored individually, and that delimiters were needed.
Since it "reads" memory input from the paddle (if desired), it must also include a "paddle reader" function -- a primitive CW decoder!
That is not merely similar to what I wrote -- it is exactly it:
So save your messages in ASCII and translate them according to this algorithm. An ASCII space 0x20 is then automatically implemented as 4 extra dit spaces on top of the 3-dit character space already realised.
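As a small illustration of that timing rule (the function names are my own, and a real keyer would drive an output pin instead of counting): every character already ends on the 3-unit character gap, so an ASCII space only has to add the remaining 4 units to reach the full 7-unit word gap.

```c
#include <assert.h>

/* Stand-in for the keyer's timing routine: here we just count
   units so the arithmetic is visible. One unit = one dit length. */
static long units = 0;
static void wait_units(int n) { units += n; }

/* Send one character given its elements, e.g. ".-" for A. */
static void send_char(const char *elems)
{
    for (const char *p = elems; *p; p++) {
        wait_units(*p == '-' ? 3 : 1); /* key down: dit = 1 unit, dah = 3 */
        wait_units(1);                 /* 1-unit gap after each element */
    }
    wait_units(2); /* top up the last gap to the 3-unit character gap */
}

/* ASCII space 0x20: the 3-unit character gap has already been sent,
   so only 4 extra units are needed for the 7-unit word gap. */
static void send_space(void) { wait_units(4); }
```

Sending 'A' (.-) costs 1+1+3+1 element units plus the 2-unit top-up, i.e. 8 units in total; a following space adds the remaining 4 units of the word gap.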
You need a lookup table to translate ASCII into Morse code, the way I described for A and Q with a delimiter. The table starts at index 0x21, one byte per entry. The prosign [HH] falls out, but it is generally transmitted as [IMI] or I I anyway, and you don't need [HH] in messages stored in a keyer's memory. Lowercase ASCII characters in the stored text, possibly present when you did not enter the text into memory with the paddles, are first converted to upper case by subtracting 0x20.
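A sketch of that table scheme, with my own assumption about the delimiter layout (read the byte MSB-first, skip leading zeros, the first 1 is the delimiter, then 0 = dit and 1 = dah); only a handful of entries are filled in, and a real keyer would index the full table from ASCII 0x21 upward:

```c
#include <assert.h>
#include <string.h>

/* One byte per character, delimiter-bit encoding (assumed layout):
 *   'A' .-    -> 1 01    = 0x05
 *   'B' -...  -> 1 1000  = 0x18
 *   'H' ....  -> 1 0000  = 0x10
 *   'Q' --.-  -> 1 1101  = 0x1D
 * A tiny demo table keyed by 'A'..'Z' is enough to show the idea. */
static const unsigned char morse_az[26] = {
    ['A' - 'A'] = 0x05,
    ['B' - 'A'] = 0x18,
    ['H' - 'A'] = 0x10,
    ['Q' - 'A'] = 0x1D,
};

/* Expand one table byte into a string of '.' and '-'. */
static void expand(unsigned char b, char *out)
{
    int i = 7;
    while (i >= 0 && !((b >> i) & 1)) i--; /* find the delimiter bit */
    for (i--; i >= 0; i--)                 /* the bits after it are the elements */
        *out++ = ((b >> i) & 1) ? '-' : '.';
    *out = '\0';
}

/* Lowercase text in memory is folded to upper case first,
   exactly by subtracting 0x20, as described above. */
static unsigned char code_for(char c)
{
    if (c >= 'a' && c <= 'z') c -= 0x20;
    return (c >= 'A' && c <= 'Z') ? (unsigned char)morse_az[c - 'A'] : 0;
}
```

Note why [HH] "falls out": eight dits plus the delimiter bit would need nine bits, which no longer fits in one byte.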
And yes, when you want to enter text via your paddles, the keyer needs a Morse-to-ASCII translation. That has the advantage that you can watch your transmitted text: you see the 6 transmitted instead of a B (or even a D), the 5 instead of the H, the : instead of the 8, and finally the enormous bunch of space characters 0x20 you generated by too-wide character spaces.
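The paddle-reader direction is just the same delimiter byte built up element by element and matched against the table at each character gap. Again a sketch under the same assumed encoding, with only a few letters filled in and hypothetical function names:

```c
#include <assert.h>

/* Same assumed delimiter encoding: start from 1, shift in one bit
   per element (0 = dit, 1 = dah). */
static const unsigned char morse_az[26] = {
    ['A' - 'A'] = 0x05, /* .-   */
    ['B' - 'A'] = 0x18, /* -... */
    ['H' - 'A'] = 0x10, /* .... */
    ['Q' - 'A'] = 0x1D, /* --.- */
};

static unsigned char acc = 1; /* delimiter bit already in place */

static void on_element(int is_dah) /* called per keyed element */
{
    acc = (unsigned char)((acc << 1) | (is_dah ? 1 : 0));
}

/* At a character gap, match the accumulated byte and reset. */
static char end_of_char(void)
{
    unsigned char b = acc;
    acc = 1;
    for (int i = 0; i < 26; i++)
        if (morse_az[i] == b) return (char)('A' + i);
    return '?'; /* no match, e.g. a botched character */
}
```

This is also exactly where the display shows you your mistakes: key one dah and four dits and the accumulated byte matches 6, not B.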
When you want an impression of your own sending: in the past there were paper tapes, and with a ruler you could measure the length of the elements.
Nowadays you can make a recording with your computer as a WAV file and look at it with Wave Studio (Sound Blaster) or Audacity (a free audio editor). You don't even need a ruler, because the cursor is accompanied by the time in ms.
(The internationally standardized ISO abbreviation for millisecond is ms, not msec.)