In computer terms, the 'clock' is the oscillator that paces the core; it's what sets the chip's frequency. It's common to hear someone say a computer is 'clocked' at 4 GHz or is 'overclocked'. These cheap chips can be bought for less than a dollar, and their on-chip oscillator is usually just a cheap RC circuit that only loosely runs at its rated frequency. To make timekeeping accurate, you need to incorporate an external crystal or real-time clock, which is just big and overkill for this application. I was just making the user aware.
If the clock is 32 kHz and drifts by even 1 Hz (about 30 ppm), it will be off by over a minute per month; at 100 Hz of drift you'd lose hours. Maybe better, maybe worse. I have seen them drift by a minute in a month or two.
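To put a number on it, here's a back-of-the-envelope calculation (the frequency and offset are just illustrative figures, not from any specific chip):

```c
#include <stdio.h>

int main(void) {
    double nominal_hz = 32768.0;            /* common watch-crystal frequency */
    double error_hz   = 1.0;                /* assumed offset, roughly 30 ppm */
    double month_s    = 30.0 * 24 * 3600;   /* seconds in a 30-day month */

    /* accumulated drift = fractional frequency error x elapsed time */
    double drift_s = (error_hz / nominal_hz) * month_s;
    printf("about %.0f seconds of drift per month\n", drift_s);
    return 0;
}
```

That works out to roughly 80 seconds per month, which lines up with the "minute in a month or two" I've seen in practice.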
You also need to be aware of overflow if you count a timer for a long time, and handle the wraparound. That, too, is a lot easier to deal with when a real-time clock handles your events.
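For what it's worth, the usual trick with a 32-bit tick counter (like Arduino's millis()) is to compare elapsed time with unsigned subtraction, which stays correct across the rollover. A minimal sketch in C:

```c
#include <stdint.h>
#include <stdbool.h>

/* Overflow-safe "has the interval elapsed?" check. Unsigned subtraction
   wraps modulo 2^32, so (now - last) is still the true elapsed tick count
   even after the counter rolls over from 0xFFFFFFFF back to 0. The naive
   test (now >= last + interval) breaks at the wraparound. */
static bool interval_elapsed(uint32_t now, uint32_t last, uint32_t interval)
{
    return (now - last) >= interval;
}
```

This only stays correct while the interval is shorter than the counter's full period (about 49 days for a millisecond counter), which is exactly the "breaks every X days" failure mode.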
I made a Wi-Fi-connected timer, waterer, and air-monitor device for a greenhouse, and I needed to incorporate an external real-time clock to make it perform to my expectations long term. It's really a matter of how picky you are.
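The idea is to derive events from the RTC's wall-clock time instead of a free-running tick count, so nothing accumulates. A rough sketch, with the driver calls stubbed out since I don't know your hardware:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical driver calls -- swap in whatever your RTC (DS3231,
   PCF8563, etc.) and board actually provide. */
bool rtc_read_time(uint8_t *hour, uint8_t *minute, uint8_t *second);
void set_lights(bool on);

/* Decide the light state from absolute time of day. Oscillator drift
   and counter overflow never accumulate, because every decision is
   re-derived from the RTC reading. */
void update_lights(void)
{
    uint8_t h, m, s;
    if (!rtc_read_time(&h, &m, &s))
        return;                       /* keep last state on a bad read */
    set_lights(h >= 6 && h < 20);     /* lights on from 06:00 to 20:00 */
}
```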
I don't know the user's application, but I imagine they might be annoyed if their lights are off by half an hour a year or two later, or if the device 'randomly' stops working predictably every X number of days when a mishandled timer variable overflows.