This is, incidentally, why it's somewhat of a misnomer to say film runs at 24 fps. True, movies are filmed at 24 fps, but when projected, each frame is actually shown 2 or 3 times in a row before advancing to the next frame - otherwise, people would notice the flickering.
Depends on how it's displayed. If it's a film projector, it's 24 FPS *period*.
Film projector shutters run at 48 or 72 Hz, because with a 24 Hz shutter the audience will notice flickering. Since the film "samples" at 24 FPS, this means that each frame is shown 2 or 3 times in sequence, just as I said. The image changes at 24 Hz, but it flickers on and off at 48 or 72 Hz.
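If you want the arithmetic spelled out, here's a quick Python sketch using just the numbers above (24 fps film, 2- or 3-bladed shutters):

```python
# Minimal sketch: a film projector flashes each frame 2 or 3 times before
# advancing, so the flicker rate is a multiple of the frame rate.
FILM_FPS = 24  # frames captured/advanced per second

for flashes_per_frame in (2, 3):  # 2-blade and 3-blade shutters
    shutter_hz = FILM_FPS * flashes_per_frame
    print(f"{flashes_per_frame} flashes/frame -> image changes at "
          f"{FILM_FPS} Hz, flickers at {shutter_hz} Hz")
```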
This is also why the optical audio track on film is not in sync with the video (i.e. the audio for a frame is not printed adjacent to the actual frame). The shutter mechanism requires the film to advance and hold, while the audio system requires the film to move continuously; thus the optical audio is offset from the video to allow for the necessary slack between the two systems.
And the problem with LCDs and a low refresh rate is the exact opposite of flickering. LCDs have to use at least a 60 Hz refresh rate to reduce ghosting, not because they would flicker.
I never mentioned LCDs.
That said, ghosting is an artifact caused by the LCD having too slow a response time, particularly relative to the refresh rate, not merely by a low refresh rate. In fact, for a given response time, increasing the frame rate actually makes ghosting MORE apparent - an LCD with a 10 ms response would look pretty good at 60 fps, but if you were able to run it at 120 or, god forbid, 240 fps, it would ghost like crazy.
This is the main reason why LCD monitors didn't become widespread until response times had dropped below 20 ms (since with a 60 Hz frame rate, the image changes every 16.7 ms), and also why the biggest complaint about the original GameBoy was the blurry screen - the GameBoy refreshes the screen at around 60 Hz (the exact value depends on the model, but it's generally in the range of 59 to 61 Hz), but the screen's response time was significantly longer than 17 ms.
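To make the numbers concrete, here's a rough back-of-the-envelope sketch in Python (the 10 ms response time and the refresh rates are just illustrative values from the argument above, not measurements of any particular panel):

```python
# Rough sketch: compare an LCD's response time to the frame period at
# various refresh rates. When the response time approaches or exceeds the
# frame period, the previous frame is still "in transit" when the next one
# arrives, which shows up as ghosting.
RESPONSE_MS = 10.0  # illustrative pixel response time

for refresh_hz in (60, 120, 240):
    frame_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:>3} Hz: frame period {frame_ms:5.2f} ms, "
          f"response is {RESPONSE_MS / frame_ms:.0%} of a frame")
```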
Active-matrix LCD monitors do need to be refreshed periodically, but this has nothing to do with ghosting - TFT cells gradually lose charge over time, much like DRAM. The required refresh rate for active-matrix LCDs is not particularly high, however, because TFT cells can retain charge for several minutes or longer - and this long charge retention time is why LCDs don't flicker, even at ludicrously low frame rates. (Passive-matrix LCDs are a slightly different story - they don't flicker because the LCD controller is directly supplying current to each cell.) Also, the backlight typically runs in excess of 100 Hz, far enough above the flicker fusion threshold that it doesn't cause visible flicker - except for backlights that are always on, which by definition can't flicker at all.
You are thinking of CRT displays, which would flicker due to phosphor decay if the refresh rate was too low.
As an aside, this is actually the reason interlacing was invented - when broadcast standards were being finalized in the US, they had decided on a frame rate of 30 Hz, which is high enough that motion appears continuous, at a vertical resolution of 525 lines per frame. (30 frames/sec was also chosen because it was compatible with North America's 60 Hz AC electrical standard.) Unfortunately, the persistence time of CRT phosphors was so short and the eye so sensitive (TV CRT phosphors persist for only a few ms, and their images persist on the retina for about 10-20 ms after that) that the upper part of each frame had already faded from vision well before the frame was done, and thus there was noticeable flickering.
So they split each frame into two fields of 262.5 scanlines each, containing odd and even numbered scanlines respectively. This meant that the image of one field appeared before the previous field had faded from the retina, and thus there was no more apparent flicker.
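Here's the same arithmetic as a small sketch, using only the NTSC numbers above (525 lines, 30 frames/sec):

```python
# Sketch of why interlacing helps: splitting each frame into two fields
# means *something* hits the retina twice as often, without increasing
# the line rate or bandwidth.
LINES_PER_FRAME = 525
FRAMES_PER_SEC = 30

fields_per_frame = 2
lines_per_field = LINES_PER_FRAME / fields_per_frame  # 262.5
field_rate = FRAMES_PER_SEC * fields_per_frame        # 60 Hz

print(f"Progressive: full image refreshed every {1000/FRAMES_PER_SEC:.1f} ms")
print(f"Interlaced:  a {lines_per_field:g}-line field arrives every "
      f"{1000/field_rate:.1f} ms")
```

With phosphors persisting only a few ms and the retina holding the image for roughly 10-20 ms, a new field every 16.7 ms keeps something visible at all times, whereas a full frame every 33.3 ms does not.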
Of course, even though motion appears smooth at 30 Hz, it appears even smoother at 60 Hz; and because a non-interlaced signal is slightly easier to generate than an interlaced one, video game consoles produced a non-interlaced 60 Hz image instead of an interlaced 30 Hz one. Because of this, the even-numbered scanlines on the CRT were never actually drawn (even though consoles from the SNES era on were capable of generating an interlaced signal, it was rarely used), so there were thin black lines between each row of pixels, giving the stereotypical console "scanline" look.
Also, since going progressive scan meant dropping the extra 1/2 scanline per frame (262 lines per frame, as opposed to 262.5 lines per field), video game consoles actually had a screen refresh rate slightly higher than 60 Hz. (Roughly 60.05 Hz, compared to 59.94 Hz for regular color TV, which is itself slightly lower than the 60 Hz refresh of B&W TV, for other reasons I don't feel like going into.) Of course, the choice of a 30 Hz frame rate for TV greatly complicated conversion of film for broadcast, since there are 1.25 TV frames for every frame of film. In Europe, the TV frame rate was instead 25 Hz (chosen because it was compatible with their 50 Hz electrical systems), so film conversion was, somewhat counterintuitively, simpler - in Europe they simply show the 24 FPS film at 25 FPS on TV, on the grounds that the 4% speed difference is hardly noticeable.
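For the curious, those refresh figures can be reconstructed from the NTSC line rates; this sketch assumes the standard line rates (15750 lines/sec for B&W, reduced by a factor of 1000/1001 for color) and a console that keeps the standard color line rate - real consoles generate their own timing, so their exact rates differ slightly:

```python
# Sketch: deriving the frame rates mentioned above from NTSC line rates.
LINE_RATE_BW    = 15750.0                 # B&W NTSC lines/sec
LINE_RATE_COLOR = 15750.0 * 1000 / 1001   # ~15734.27 lines/sec for color

print(f"B&W TV:   {LINE_RATE_BW / 262.5:.2f} Hz (262.5 lines/field)")
print(f"Color TV: {LINE_RATE_COLOR / 262.5:.2f} Hz (262.5 lines/field)")
print(f"Console:  {LINE_RATE_COLOR / 262.0:.2f} Hz (262 lines/frame)")

# Film-to-TV ratios: NTSC needs pulldown, PAL just runs the film ~4% fast.
print(f"NTSC: {30 / 24:.2f} TV frames per film frame")
print(f"PAL:  film sped up by {(25 / 24 - 1):.1%}")
```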