I got two PI zero boards, but have found little information about them.
Perhaps someone here can direct me to the answers.
Does anyone know where schematics can be found?
On the boards are pads marked "TV" and "RUN" on the front. What are these?
On the back are pads marked J5. What are those?
No - Even if there were, there's nothing you can do with it - not unless
you're an employee of Broadcom with the right equipment and data sheets.
Just forget the JTAG.
It's a Raspberry Pi - more or less equivalent to the current A+
model without the audio and composite video connectors and associated
circuitry. It runs Linux (and other OSs) and has a bunch of user
controllable GPIO pins and ports.
The sad part of composite video is that it seems to be going away in the
consumer market (US). I think I have (maybe) one working monitor left
that will accept composite video, and automatically shows it in green.
Used to have an amber one, but it died years ago.
& it is available on a SCART socket (do modern TVs still have these?)
My Samsung TV has HDMI, SCART, composite, component & VGA. I do not think it
for smaller form factors; many monitors designed for in-car DVD players have
Composite is a colour signal.
I thought the sad part of composite video was that it was always a bit
crap. But if you must use it then composite to HDMI converters are
readily available so you can view its fuzzy inaccurate colours on
your HD TV.
What about TV sets?
In Europe all new TV sets still seem to have composite input and
output, (on a horrible 21 pin connector known as a SCART)
Sets in Europe are still sold with analogue and digital tuners even
though, apart from a few areas in the former Soviet Union, there is no
longer any broadcast analogue TV.
Analog broadcast is still around in parts of the US too. Besides, I
need a way to connect my video tape player. At least if I get a new TV,
the DVD player won't have to connect *through* the video tape player.
The Pi has two JTAG ports. The ARM JTAG port provides a standard ARM core
debug TAP which means that debuggers like GDB and DS5 can breakpoint,
single-step, read registers, write memory, etc. Look at OpenOCD for
instance to drive it. The early Pis had JTAG debugging made awkward by
various stupid pin assignments, but I hope they've fixed that on the Zero.
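For what it's worth, on current firmware the awkward pin assignment can be worked around from config.txt: the enable_jtag_gpio option routes the ARM JTAG signals onto GPIOs 22-27 (ALT4). A sketch, assuming a probe OpenOCD already supports (the interface file here is only an example, and the BCM2835 target script may need to come from a third-party config):

```shell
# /boot/config.txt: put GPIOs 22-27 into ALT4, i.e. the ARM JTAG function
#   GPIO22=TRST GPIO23=RTCK GPIO24=TDO GPIO25=TCK GPIO26=TDI GPIO27=TMS
enable_jtag_gpio=1

# Then, with a probe wired to those pins, something along the lines of:
openocd -f interface/jlink.cfg -f bcm2835.cfg
# ...and from another terminal: arm-none-eabi-gdb, then "target remote :3333"
```

The exact interface and target files depend entirely on your probe and OpenOCD version.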
The VideoCore JTAG is more problematic, though some minimal reverse
engineering has been achieved.
I'm unclear which port the JTAG on the Zero is. If it's the VideoCore then
you're broadly correct; if it's the ARM then there's plenty that can be done.
Composite monochrome output was common for some considerable time before
microprocessor-based computers were capable of generating colour output,
though the colour of the display varied.
I always assumed that was due to the phosphor on your CRT: close
examination of the screen on the green-screen 12" CRT I use with my 6809
box doesn't show the pixellation that you normally see on a colour
screen, though - it looks like a single, even coat of phosphor.
Don Lancaster's "The Cheap Video Cookbook" told you how to build a dirt
cheap memory mapped display. I went one better: my memory mapped display
used a 2K block of static RAM that was part of the MPU's address map and
included a 6845 CRTC chip, run 180 degrees out of clock phase with the
6809 CPU so both could run at full speed without interference. The 6845
scanned each line of RAM once for each monitor display line, passing the
address contents and scan line number to an EEPROM as an address and
reading out the dot pattern for that scan line of that character into a
shift register. The 6845 also generated the clocks needed to generate
scan lines from the shift register content as well as triggering the CRT's
line flyback and framing signals. This generated mono composite video:
producing colour would have needed more RAM than the 2KB that was used,
which was only big enough to deal with a monochrome 24 x 80 display. Colour
would have needed at least another 2KB to hold the colour attributes,
etc. of each byte.
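The EEPROM lookup described there can be sketched in Python. The glyph data below is invented for illustration, and the 8-row character cell is an assumption; the real hardware did this in logic, not software:

```python
# Sketch of 6845-style character generation: for each monitor scan line,
# the CRTC walks a row of display RAM; each character code plus the scan
# row forms a character-generator ROM address, and the byte read out is
# the dot pattern that the shift register serialises, MSB first.

# 8x8 character-generator "EEPROM": address = char_code * 8 + scan_row.
CHAR_ROM = {}
GLYPH_A = [0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00]  # made-up 'A'
for row, bits in enumerate(GLYPH_A):
    CHAR_ROM[ord('A') * 8 + row] = bits

def scanline_dots(display_ram, scan_row):
    """One monitor scan line's worth of dots for a row of character codes."""
    dots = []
    for code in display_ram:
        pattern = CHAR_ROM.get(code * 8 + scan_row, 0)
        for bit in range(7, -1, -1):          # shift register, MSB first
            dots.append((pattern >> bit) & 1)
    return dots

# One character's worth of scan line 3 (the crossbar row of the glyph):
line = scanline_dots([ord('A')], 3)
print(''.join('#' if d else '.' for d in line))   # -> .######.
```

Each display line is scanned 8 times (scan_row 0..7) before the CRTC moves to the next row of RAM, which is why 24x80 characters fits in under 2KB.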
That was a direct result of using a standard TV display: the resolution
was limited by the TV's bandwidth or, if it was a colour TV, the size of
the phosphor dots seriously limited the display resolution for text. Have
a close look (from 12-18") at any colour TV (CRT, not LCD/OLED/digital
screen) and you'll see what I mean.
A reasonable British monochrome 625 line CRT TV monitor did a pretty good
job of handling a 24x80 display, while a standard US TV had only 525
scan lines and a considerably lower horizontal resolution and, as a
consequence, struggled to show even a 16x64 display. A lot of the early
US microcomputers used even lower horizontal resolution: IIRC 16x48
displays were quite common (Apple II with colour TV?) and some were as
low as 16x32 (KIM1 with Cheap Video Cookbook display?), though that may
have had more to do with available RAM than with TV horizontal resolution.
martin@ | Martin Gregorie
gregorie. | Essex, UK
I think you overestimate the difference between the TVs and underrate
the difference in the standards. In the US much of the limitation was
bandwidth in the composite format. I added optoisolators to a TV to
avoid the composite input and got acceptable performance with 80 chars.
The real issue was they were squished together a bit too much
visually. 80 was at the TV's limit but it didn't have any trouble at
all displaying 64 chars.
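The arithmetic backs that up. A rough upper bound on horizontal dots is two dots (one light/dark pair) per cycle of video bandwidth times the active line time; a sketch using the nominal broadcast bandwidths (4.2 MHz US, 5.5 MHz UK; real receiver circuitry usually managed rather less):

```python
def max_dots(bandwidth_hz, active_line_s):
    # Two dots per cycle of video bandwidth across the active line.
    return int(2 * bandwidth_hz * active_line_s)

us = max_dots(4.2e6, 52.6e-6)   # US: ~441 dots per line
uk = max_dots(5.5e6, 52.0e-6)   # UK: ~572 dots per line

# At 7 dots per character cell: 64 chars needs 448 dots, 80 needs 560.
print(us // 7, uk // 7)         # rough characters per line
```

So 80 columns sits right at (or just past) a US set's theoretical limit while 64 is comfortable, and a UK set clears 80 with a little to spare - matching the experience described above.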
To echo Rob's remark, composite is a colour signal: composite means a
composition of two or more component parts, these being the basic
luminance waveform with the chrominance and reference burst added.
Obviously the vertical definition is lower with fewer scan lines, but
why should the horizontal definition be worse on a 525/60 set compared
to a 625/50 one?
It turns out that time taken to paint a scan line across the screen,
ie the active line period, excluding the blanking interval, is almost
identical for each system
625/50 = 52.0 microseconds
525/60 = 52.6 microseconds
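The arithmetic behind those figures, as a sketch (total line period from the line frequency, minus the standard horizontal blanking interval):

```python
# 625/50: 625 lines x 25 frames/s = 15625 lines/s
line_625 = 1e6 / (625 * 25)      # 64.0 us total line period
active_625 = line_625 - 12.0     # ~12 us blanking -> 52.0 us active

# 525/60: the NTSC colour line frequency is 15734.26 Hz
line_525 = 1e6 / 15734.26        # ~63.6 us total line period
active_525 = line_525 - 10.9     # ~10.9 us blanking -> ~52.7 us active

print(round(active_625, 1), round(active_525, 1))   # -> 52.0 52.7
```

So despite the different line counts and field rates, the time available to paint each visible line is within about 1% between the two systems.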
On a mono baseband monitor intended to render TTL text there would be
nothing to gain by limiting the bandwidth of the video signal.
This is an almost exact description of the video on the Commodore PET.
They went to 2k and 80x25, with the last line used either as normal or
as a status line.
With a higher number of scan lines on proprietary monitors you could
have a lot better resolution. Commodore did that from the 8xxx onwards.
Back in the early Apple II# days, I was in Germany. Monitors were too
high in price to think about. As a result, I ended up modifying a Sony
AC/DC portable TV, adding an Apple compatible composite port, and
tweaking the video response to accommodate an 80 column card. That
worked fairly well, until I was back in the states, and ordered a
monitor. That was interesting, because the vendor shipped one with minor
problems. I complained, and they told me to keep it, and they shipped a
replacement that worked well. I ended up giving the extra one to another
apple user. For a time, I used both the monitor for text, and a small
color TV for graphics. You had to take the Apple to the TV store (KMart
at the time) to make sure that the TV you wanted to use would work
properly with the Apple. Not a few didn't do too well.
Back in the early 70's, there was a commercial version of Lancaster's?
Glass Teletype. We used it with some of the first controller based
electronic test systems bought from a now defunct company, Zentel.
The systems ran from Mylar/paper tape, and had an odd memory scheme that
involved a ring of flip-flops. A counter kept track of where the desired
data was, and it was grabbed on the fly. Core and plated wire memory was
king, but horribly expensive.
The difference may have been in the shadow mask. Most (until the
Trinitron tubes were produced) used circular holes through which the
three electron beams were projected onto the phosphor. If you have
fewer vertical lines, you can use slightly larger holes, spaced more
widely. This reduces the resolution in both the vertical and horizontal
directions.
I once modified a Trinitron set to provide direct input to the guns,
so that a BBC micro could produce a better colour display. This
bypassed the composite video limitations on bandwidth - the colour
subcarrier required a filter to restrict the monochrome bandwidth. The
difference in picture quality was striking.
Alan Adams, from Northamptonshire
IIRC NTSC used some sort of phase shift to encode the colour (with the
amount of rotation of the phase shift selecting the next colour). As this
alters the relative brightness of the colour guns you get a sort of
colour blur as the electronics track round the colour sequence. In any case
the horizontal edge of an NTSC colour change is indefinite and of much
lower resolution than the vertical colour change resolution, which is
simply the scan line spacing. I suppose that if you grew up with NTSC
colour you'd not consciously notice this because 'that's just how TV is',
but if you've grown up with PAL colour encoding the lower colour
resolution of NTSC is very noticeable.
Economics limits the bandwidth. Back in the day[*] most people didn't use
VDU quality displays. When the first microcomputers (IMSAI, MITS Altair,
SWTPC) appeared you used an ASR33 teletype or, if you had real money, a
second hand serial terminal to talk to it. A bit later microcomputers
grew ASCII keyboards with parallel connections and memory mapped displays
with composite video output. Now the standard hobbyist monitor was either
an unmodified TV set or what was effectively a TV set with the tuner
removed (this alone improved the resolution of a monochrome display). No
mass market TV manufacturer is going to use better components than they
need to because higher bandwidth tubes/transistors and more accurate
filters cost more.
[*] This was 1975/76. I was working in NYC at the time, in the 500s on
Madison, and so could wander over to The Computer Store on 5th, a few
blocks up from the Empire State Bldg, at lunch time and have a play. They
sold the brands I've listed. Commodore Pets, Trash-80s and Apple IIs were
all a year or three later.
martin@ | Martin Gregorie
gregorie. | Essex, UK
The American system is called NTSC (National Television Systems
Committee, or "never twice same color").
Analogue colour television sends three signals: one black-and-white
(luminance) signal, marked Y, and two colour difference signals.
The luminance channel runs at full video bandwidth (PAL: about 5 MHz
in a 7 MHz channel) and the difference channels have limited bandwidth
(PAL: 1.1 MHz). This is like giving someone a full resolution black and
white image and a set of wide colour pens. The eye is so forgiving that
the result passes for the full thing.
The difference channels are modulated as double sideband suppressed
carrier signals on the subcarrier (NTSC: 3.58 MHz, PAL: 4.43 MHz).
To keep the signals separate, the subcarriers are copies of the same
frequency, but at 90 degree phase difference.
What makes PAL less susceptible to phase errors is that one difference
signal is inverted on alternate scan lines, so the receiver can
compensate by averaging the phases of two lines.
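A minimal sketch of that Y/U/V split and the quadrature modulation, using the standard BT.470 weightings (the scaling constants and the PAL V-switch are as described above; everything else is illustrative):

```python
import math

def rgb_to_yuv(r, g, b):
    """Luminance plus two scaled colour-difference signals."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)    # scaled B-Y
    v = 0.877 * (r - y)    # scaled R-Y
    return y, u, v

def chroma_sample(u, v, t, subcarrier_hz, pal_line_odd=False):
    """U and V ride on the same subcarrier 90 degrees apart (quadrature).
    In PAL, V is inverted on alternate lines, so a receiver can cancel
    phase errors by averaging adjacent lines."""
    if pal_line_odd:
        v = -v
    w = 2 * math.pi * subcarrier_hz * t
    return u * math.sin(w) + v * math.cos(w)

y, u, v = rgb_to_yuv(1.0, 0.0, 0.0)               # pure red
print(round(y, 3), round(u, 3), round(v, 3))      # -> 0.299 -0.147 0.615
```

Note that a pure red carries only ~30% luminance, which is why saturated colours look dim on a monochrome set; and the sign flip in chroma_sample is the entire "Phase Alternating Line" trick.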