Even though you ask for nanosecond (billionth) precision, I'm going to assume you really mean millisecond (thousandth) precision. DATEPART only provides slicers down to the millisecond.
Using an expression, the pieces needed to build the formatted string look like this:
Right("0" + (DT_STR,2,1252) DatePart("hh",getdate()),2) + Right("0" + (DT_STR,2,1252) DatePart("mi",getdate()),2) + Right("0" + (DT_STR,4,1252) DatePart("ss",getdate()),2) + Right("000" + (DT_STR,3,1252) datepart("Ms", getdate()),3) + (DT_STR,4,1252) datepart("yyyy", getdate()) + (DT_STR,2,1252) datepart("mm", getdate()) + (DT_STR,2,1252) datepart("dd", getdate())
I don't know how GETDATE works internally, but I found this question in a search: What's the best way to measure how long code takes to execute? I assume it basically boils down to a call to DateTime.Now. Cribbing quotes from Eric Lippert in that thread, this one was the most apropos:
Note that the "wall clock" time measured by DateTime is only accurate to something like 30 ms. DateTime is designed to represent things like a clock on a wall or the time a file was last edited; it doesn't need nanosecond accuracy, and therefore it doesn't have it.
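If you want to see that coarseness yourself, a quick throwaway console sketch like the following counts how often DateTime.Now actually changes in one second (the exact numbers vary by OS and .NET version):

using System;
using System.Diagnostics;

class ClockResolution
{
    static void Main()
    {
        // Count how many distinct values DateTime.Now reports in one second.
        // On older Windows builds it only ticks every 10-15 ms or so, which is
        // why it's a poor source for sub-millisecond timestamps.
        DateTime last = DateTime.Now;
        int changes = 0;
        Stopwatch sw = Stopwatch.StartNew();
        while (sw.ElapsedMilliseconds < 1000)
        {
            DateTime now = DateTime.Now;
            if (now != last)
            {
                changes++;
                last = now;
            }
        }
        Console.WriteLine("DateTime.Now changed " + changes + " times in one second.");
    }
}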
If you must have nanosecond accuracy, happy hunting, but an expression isn't going to get you there. As @fegemo pointed out, a Script Task with custom formatting can get you to ten-millionths of a second, but that's still two orders of magnitude short of your desired accuracy.
this.Dts.Variables["User::CustomFormat"].Value = DateTime.Now.ToString("HHmmssfffffyyyyMMdd");