I don't think there is any user menu adjustment for this. On older analog TVs you may find internal analog components that can be tweaked, but the most modern sets used a single crystal as the time base for all picture processing, so there is nothing to tune at all. You may have some luck with modern USB TV receiver dongles, where the dongle just digitizes the IF and the rest is processed in software (all the demodulation, sync separation, color decoding, audio demodulation, ...); some drivers allow quite extensive tweaking of the parameters, since all the processing happens in software anyway. I would expect the same for video digitizers (same idea: the hardware just samples the signal at a few MHz and the rest is done in software). These digitizers may well offer wider tweaking possibilities, mainly because their main use is to recover analog recordings, which tend to have various distortions, and for the most common distortion types (uneven tape stretching, ...) the software will likely already contain some automatic corrections.
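As a rough illustration of what "sync separation in software" means, here is a minimal sketch (not any particular driver's API) that finds horizontal sync pulses in a sampled composite baseband signal by thresholding below the black level; the sample rate and threshold are just assumptions:

```python
import numpy as np

def find_hsync_edges(composite, fs=13.5e6, sync_threshold=0.15):
    """Locate falling edges of horizontal sync pulses in a sampled
    composite video signal (normalized 0.0 = sync tip, 1.0 = peak white).
    fs and sync_threshold are illustrative assumptions, not values from
    any specific capture device."""
    below = composite < sync_threshold               # samples inside sync pulses
    edges = np.flatnonzero(~below[:-1] & below[1:])  # transitions into sync
    # Reject serration/equalization pulses by requiring roughly one line
    # (~64 us for PAL/SECAM, ~63.5 us for NTSC) between accepted edges.
    min_gap = int(0.9 * 64e-6 * fs)
    keep, last = [], -min_gap
    for e in edges:
        if e - last >= min_gap:
            keep.append(e)
            last = e
    return np.asarray(keep)

# Example: synthetic lines of video with a 4.7 us sync pulse at the start of each.
fs = 13.5e6
line = np.full(int(64e-6 * fs), 0.3)   # blanking level
line[:int(4.7e-6 * fs)] = 0.0          # sync tip
print(find_hsync_edges(np.tile(line, 10), fs))
```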
A classic CRT TV design uses a true PLL to lock horizontally and a simple "reset by sync pulse" scheme for the vertical. That usually leads to a wide sync range for the vertical but a narrower one for the horizontal. The reason is that the accumulated noise causing jitter in a TV broadcast (a signal generated with perfect timing, but received with significant noise) is proportional to the bandwidth. For vertical sync, a simple low-pass filter (a few kHz) reduces the noise enough that the sync stays solid even when nothing recognizable is visible on the screen anymore. With the 15 kHz horizontal rate, similarly direct sync would need the cutoff frequency to be set much higher (well above 150 kHz), letting through far more noise and causing too much horizontal jitter.

The way around this was a PLL. It can generate the 15 kHz horizontal frequency with a noise bandwidth of even a few tens of Hz (if the oscillator phase noise allows it), dictated by the PLL loop frequency response. But that same frequency response also dictates how far the signal can be from the free-running oscillator frequency and still achieve lock. Early TVs had a single, fixed loop filter, which had to be a compromise between robustness against component tolerances and noise bandwidth. It usually required precise alignment, which becomes a problem for long-term stability, and even then the noise bandwidth was rather wide.

Later ICs made the loop filter adaptive: when the PLL was not locked, it switched to a fast response, covering a wide range of component tolerances; once locked, it switched to a slow response, greatly reducing the bandwidth and therefore the noise, which improved picture quality and robustness against signal disturbances. This allowed a very wide lock-in range (so sloppy, cheap component tolerances were acceptable) while still offering good jitter performance, or, with somewhat more accurate components, really very low jitter. Many of the newest analog TVs used a similar technique for the vertical signal and the color carrier as well, all inside one large picture-processing chip (deflection pulse generation, color decoding) surrounded by just the power components and a single crystal as the master time base for everything.
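To make the adaptive-bandwidth idea concrete, here is a toy software model of a horizontal PLL with a loop filter that switches from a wide to a narrow bandwidth once it detects lock; the gain pairs, the lock criterion, and the switching threshold are illustrative assumptions, not values from any real chipset:

```python
import numpy as np

def track_hsync(sync_times, f0=15625.0):
    """Toy software PLL tracking horizontal sync pulse timing.
    sync_times: measured arrival times (s) of successive H sync pulses.
    f0: free-running oscillator frequency (Hz).
    Gains, lock criterion and switch threshold are made-up illustrations."""
    fast = (0.3, 0.02)     # wide loop: quick pull-in, lets more jitter through
    slow = (0.02, 0.001)   # narrow loop: slow, but strong noise rejection
    kp, ki = fast
    period = 1.0 / f0
    phase = sync_times[0]          # predicted time of the next sync pulse
    integ, locked_count, out = 0.0, 0, []
    for t in sync_times[1:]:
        phase += period            # oscillator predicts where the pulse should be
        err = t - phase            # timing error against the actual pulse
        integ += ki * err          # integral path absorbs the frequency offset
        phase += kp * err + integ  # loop filter corrects the prediction
        # Crude lock detector: a long run of small errors -> narrow the loop.
        locked_count = locked_count + 1 if abs(err) < 1e-6 else 0
        kp, ki = slow if locked_count > 32 else fast
        out.append(phase)
    return np.asarray(out)

# Example: sync arriving at 15500 Hz (off the nominal 15625 Hz) with timing noise.
rng = np.random.default_rng(0)
t = np.arange(2000) / 15500.0 + rng.normal(0.0, 0.2e-6, 2000)
est = track_hsync(t)
print("raw sync period jitter   (us):", 1e6 * np.std(np.diff(t[-1000:])))
print("PLL output period jitter (us):", 1e6 * np.std(np.diff(est[-1000:])))
```

The wide setting pulls the oscillator onto the off-frequency signal within a few dozen lines; once the narrow setting takes over, the regenerated timing is far quieter than the raw sync pulses, which is exactly the trade-off described above.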
Some chips contained compensation for the time-stretching distortion typical of VCRs, where the horizontal frequency and the color carrier were kept tied together but were allowed to be modulated across the frame as a whole. Or, often, the slow response was simply disabled when the set was switched to VCR display mode. These VCR modifications varied a lot among chipset makers, differing in how much deviation they tolerated. It was usually a compromise between picture quality on a not-too-distorted recording versus how severe the distortion could get while still playing back at least somehow.
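In terms of the toy loop above, the simplest VCR-mode tweak amounts to never selecting the narrow setting; a hypothetical helper, again with made-up constants:

```python
def loop_gains(locked, vcr_mode=False):
    """Pick the (kp, ki) loop gains; constants are illustrative only.
    Normal mode narrows the loop once locked; the hypothetical VCR mode
    simply never narrows it, so the horizontal oscillator keeps following
    the tape's line-to-line time-base wander instead of averaging it out."""
    fast, slow = (0.3, 0.02), (0.02, 0.001)
    return fast if (vcr_mode or not locked) else slow
```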