Pi power

I'm not sure about the 4, because I don't own one, but this is true for at least all Pis before the 4. My shiny new Pi 400 does, however, have the capability of a soft power-off. It even has a special key combo for it.

This is, however, completely irrelevant to this thread, because the OP's Pi was still being powered after he physically switched the power supply off.

--
[J|O|R]
Reply to
Oscar

Power, but only 55mA

The "seeing" is over the DDC pin

Reply to
Andy Burns

Not enough to run a Pi, then, I imagine, but enough to flutter the Pi's red/green LEDs.

--
Tim
Reply to
TimS

The red LED just shows that power is on - nothing more.

The yellow and green ones show whether the Pi is active: stop a running Pi with the 'sudo halt' command and all the LEDs go out apart from the red one, which stays on until you switch off or unplug the wall wart.

That you don't know that suggests that you usually stop your Pis by powering them off. Not a good idea, because it can corrupt the SD card - and Murphy says that you WILL sooner or later power it off while the SD card is in the middle of wear levelling - which will cause unrecoverable errors.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

Thanks - that's useful to know.

No "usually" about it. This is my first Pi and it was unboxed approx 24 hrs ago.

Roger that, Houston.

--
Tim
Reply to
TimS

It's exactly the same with VGA.

The purpose of this is for the computer to read the EDID ROM from the monitor. It supplies 5V to allow detection of the connected monitor even if it's turned off.

The monitor/KVM should not be emitting power on this rail.

Although it wouldn't have taken much for the Pi to incorporate a diode to prevent back-powering. Maybe there are use cases where people want to power their Pis from HDMI, I don't know.
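As a rough sketch of what the host does over that 5V/DDC path: it reads the monitor's EDID block and decodes it. The byte layout below (8-byte magic header, 2-byte packed manufacturer ID) follows the VESA EDID spec, but the sample bytes here are invented, not read from real hardware - on Linux the raw block is usually exposed somewhere under /sys/class/drm/, with the exact path depending on the connector.

```python
# Sketch: validate an EDID block and decode its manufacturer ID.
EDID_MAGIC = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def manufacturer_id(edid: bytes) -> str:
    """Decode the three-letter PNP manufacturer ID packed into bytes 8-9."""
    if edid[:8] != EDID_MAGIC:
        raise ValueError("not an EDID block")
    word = (edid[8] << 8) | edid[9]
    # Three 5-bit fields, each a letter with 1 = 'A'.
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

# 0x10AC encodes 'DEL' (Dell); pad to the full 128-byte base block.
sample = EDID_MAGIC + bytes([0x10, 0xAC]) + bytes(118)
print(manufacturer_id(sample))  # DEL
```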

Theo

Reply to
Theo

Yes, but you can't get an HDMI interface that doesn't implement HDCP, can you? That's the whole point: it (supposedly) prevents you viewing things you shouldn't with your TV (or whatever) that has an HDMI connector.

--
Chris Green
Reply to
Chris Green

Doing that would make the installation tidier if I wanted to glue the Pi enclosure to the back of the screen - provided that the screen can supply the wattage needed. Is that likely, bearing in mind that, if I did that, I'd also want it to power a keyboard and mouse via the USB sockets on the Pi?

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

HDCP is optional - both sources and sinks can elect not to have it (the Pi has HDCP support in hardware but doesn't enable it, for example). Obviously if you're a monitor and don't implement it, you won't work with HDCP sources.

Many cheap HDMI splitters strip off HDCP (there's no way to pass through HDCP communication with 3 concurrent sinks and it's cheaper just to strip it) so it can easily be bypassed.

Theo

Reply to
Theo

HDMI will never power a Pi, way too thirsty.

Similar devices such as an Amazon Firestick have to have an external PSU, as they require more power than even the TV's USB port can supply. Note that if this does seem to work, you are probably overstressing your TV's electronics, which will lead to premature failure.

---druck

Reply to
druck

I shut down the Pi and, when it had finished, turned off its power at the switch on the socket strip. This caused the green-LED fluttering I mentioned before. I then disconnected the HDMI cable from my Mac (the cable that goes to the other input of the KVM unit). The green and red LEDs on the Pi then went out.

So it's my Mini that's doing this, but that is perhaps not surprising given what I've learnt in this thread.

--
Tim
Reply to
TimS

For filming, but it was never displayed at that rate. In 50Hz countries, telecines and cinema projectors had a double shutter to display each frame twice (though with no interpolation), and NTSC telecines repeated frames alternately 2 or 3 times to get it up to 60. Not sure if US cinemas did that, or put up with 48Hz. Modern telecines can interpolate for smoother motion, and play at any film speed.

--
Joe
Reply to
Joe

Makes sense in light of the content of earlier posts: if the HDMI cable only carries power so it can wake up some HDMI chip in the, possibly powered-off, device at the far end which tells the initiator what it is talking to and what that can or cannot do, then it follows that doing this only needs a few milliamps. I thought that might be the case: thanks for confirming it.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

They actually played it 4% faster at 25fps and then doubled it to 50fps.

Which gives the correct speed, but terrible tearing of movement when different film frames are used for each field of an interlaced TV frame. Luckily not something you see any more with progressive formats.

---druck

Reply to
druck

Possibly, but probably not. Thomas Edison believed 48 FPS was the minimum for quality projection. Is that during your life? Film was recorded at 24 FPS, yes, but projected so that each frame was shown two or three times: 48 to 72 FPS.

At around 16 FPS the motion stops looking jerky, but there's still a visible flicker and some eyestrain. Faster playback rates (with duplicated frames) remove the flicker but, depending on the duplication method, can introduce other artifacts - 24FPS source shown on a television, for example.

Televisions used AC frequency as a reference, so preferred 50 FPS in 50Hz parts of the world and 60 FPS in 60Hz areas for display, and half that for recording. Stuff recorded at other frame rates gets extra frame duplication to keep pace:

25FPS source:  ABCDEFGHIJKLMNOPQRSTUVWXY
yields, by simple doubling:
50FPS display: AABBCCDDEEFFGGHHIIJJKKLLMMNNOOPPQQRRSSTTUUVVWWXXYY

24FPS source:  ABCDEFGHIJKLMNOPQRSTUVWX
yields, by mostly doubling but also padding:
50FPS display: AAAABBCCDDEEFFGGHHIIJJKKLLMMNNOOPPQQRRSSTTUUVVWWXX

Theoretically you could do something like AAABBCCDDEEFFGGHHIIJJKKLLMMMNNOOPPQQRRSSTTUUVVWWXX, but I think the TV signal pipeline is designed around 25 FPS input.

24FPS to 60FPS works similarly, but has more doubled doubles.

As technology has improved, so have standards. Artifacts of slower displays were previously tolerated, but now an effort is made to avoid those problems. Now you can more easily accommodate that AAABBCCDDEEFFGGHHIIJJKKLLMMMNNOOPPQQRRSSTTUUVVWWXX version, for example.
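The repeat patterns described in this post reduce to a one-liner: pair each source frame with a repeat count. A minimal sketch, where the 24-to-50 padding repeats the first frame four times (just one possible choice) and 24-to-60 uses the classic 3:2 pulldown:

```python
def expand(frames, repeats):
    """Duplicate each source frame repeats[i] times for a higher display rate."""
    return "".join(f * r for f, r in zip(frames, repeats))

src25 = "ABCDEFGHIJKLMNOPQRSTUVWXY"  # 25 source frames
src24 = "ABCDEFGHIJKLMNOPQRSTUVWX"   # 24 source frames

# 25 -> 50 fps: every frame shown twice.
fifty_from_25 = expand(src25, [2] * 25)

# 24 -> 50 fps: pad by showing the first frame four times, the rest twice.
fifty_from_24 = expand(src24, [4] + [2] * 23)

# 24 -> 60 fps: 3:2 pulldown, alternating 3 and 2 repeats.
sixty_from_24 = expand(src24, [3, 2] * 12)

print(len(fifty_from_25), len(fifty_from_24), len(sixty_from_24))  # 50 50 60
```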

Totally as an aside:

One interesting trick I've seen done is mixing "frame" rates to make things look jerky. In the _Lego Movie_ the camera and backgrounds update at 24FPS, but the characters in the foreground update at 12FPS. It gives the film a much more hand animated look.

I've got a slowed down sample from a since removed Youtube clip that I made in 2014:

https://qaz.wtf/C/lego-sample.gif

Note that the character spinning in the background on the left animates every frame, while the walking hero front and center animates every other frame.

Elijah

------ severely compressed the colors in the gif to get the file size down

Reply to
Eli the Bearded

No, but 50Hz interlaced monitors (stripped TVs really) are, and removing the interlace was considered enough to make it flicker-free and "professional" circa 1980 - I sometimes turned the interlace back on and pushed the resolution up. In more recent times 60Hz was said to be needed for flicker-free viewing, now it's 70Hz I'm seeing, and high-end TVs are currently advertising 120Hz.

Weren't domestic cine projectors 24 fps, and considered good enough for the purpose?

--
Steve O'Hara-Smith                          |   Directable Mirror Arrays 
C:\>WIN                                     | A better way to focus the sun 
The computer obeys and wins.                |    licences available see 
You lose and Bill collects.                 |    http://www.sohara.org/
Reply to
Ahem A Rivet's Shot


The way I remember it is that one of the reasons for white on black was the eye's lower sensitivity to flicker. The first monitors with a white background were hailed as a huge improvement, but they all - both highly expensive ISA card/monitor combos and the Atari - needed 70 Hz to be acceptable. 60 Hz was considered borderline for colour when white backgrounds were avoided.

--
/ \  Mail | -- No unannounced, large, binary attachments, please! --
Reply to
Axel Berger

16fps IIRC?
--
Ideas are more powerful than guns. We would not let our enemies have  
guns, why should we let them have ideas? 

Josef Stalin
Reply to
The Natural Philosopher

It was interestingly variable from person to person. Some of my colleagues worked happily on monitors set at 60 Hz, but anything less than 70, preferably 75 Hz, made me feel sick.

John

Reply to
John Aldridge

Several different issues there.

Old CRT TVs worked with an interlaced signal so needed long-persistence phosphors, so that each field of alternating scan lines would still be visible when the next field was displayed. UK PAL TVs were set up for 50Hz fields so had a longer persistence than US NTSC at 60Hz.

CRT monitors (generally) worked with non interlaced signals, so had low persistence phosphors as all the scan lines would be updated every frame. Monitors were invariably designed for the US market so had phosphor persistence for a minimum of 60Hz frame rate, and would look terrible if driven at 50Hz.

As monitor resolutions and sizes increased the frame rates also increased to 70Hz or 75Hz, to reduce flicker and smearing of screen updates.

Now with LCD, LCD/LED and OLED technologies the screen does not flicker at the update rate. Panels used to have a 60Hz update rate to match US content (though they were fine for 50Hz); active 3D then drove the adoption of 120Hz or greater panels, so alternate views could be shown at 60Hz.

3D has largely been dropped now, but 120Hz or greater panels still exist, so they can either be driven at a higher frame rate by games consoles, or additional intermediate frames are generated for lower refresh rate content, in both cases to make animation smoother.

---druck

Reply to
druck

ElectronDepot website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.