Maybe a third of the ICs made in the world are exposed using my laser controller.
The cost of developing one new 7 nm class chip is approaching a billion dollars, and the EUV industry is a decade behind schedule and so far not profitable. The physics and chemistry are horrible. There aren't many chips that are worth a billion dollars of NRE. Memory, mostly.
Cell phones are replacing computers. Small uPs are wire bond dominated in area. We just don't need to keep pushing Moore's Law.
--
John Larkin Highland Technology, Inc
lunatic fringe electronics
If you haven't got experimental evidence, it's easy to bark up the wrong tree (as you point out from time to time, though you don't seem to understand that observational sciences also accumulate experimental evidence).
Einstein died in 1955, and we've done a lot of experiments and made a lot of new observations since then.
Second-guessing how Einstein would have reacted to that evidence is foolish, and Cursitor Doom thinking that he could was even more foolish.
Has it occurred to you there isn't one to publish? They didn't really take the image with a camera in any literal sense. It is very possible there simply was never an image created that didn't include the processing. Wouldn't be much point, would there?
--
Rick C.
- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209
The only reason why the chips are wire bond size limited is because they aren't integrated adequately. There are too many other chips on the boards. This is because memory and other logic are better made separately. Once processing limits are finally pushed they will have no reason to not integrate everything on one chip. Then finally the phone will become a commodity with essentially no real differences other than software, much like a PC.
--
Rick C.
+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209
Just resample the supplied image and you will get a pretty good idea.
There are a lot of them about. But the original image reconstruction will have been as small as they could get away with, consistent with being able to suppress off-field aliases, probably with a guard band.
I am no longer up to speed with the exact details, as things have moved on somewhat since the 1980s, even if the basics are still the same.
It probably wasn't originally much more than a 64x64 image grid though.
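The resampling suggestion above is easy to try. Here is a minimal sketch of a block-average downsample in plain Python; note the 64x64 grid size is the poster's guess, not a published figure, and a real attempt would use NumPy or PIL on the actual released image:

```python
def block_average(img, factor):
    """Downsample a 2-D list-of-lists image by averaging factor x factor blocks.
    Assumes the image dimensions are divisible by `factor`."""
    h, w = len(img), len(img[0])
    out = []
    for by in range(h // factor):
        row = []
        for bx in range(w // factor):
            total = sum(img[by * factor + y][bx * factor + x]
                        for y in range(factor) for x in range(factor))
            row.append(total / factor ** 2)  # mean brightness of the block
        out.append(row)
    return out

# Toy example: shrink a 4x4 image to 2x2.
tiny = [[0, 0, 8, 8],
        [0, 0, 8, 8],
        [2, 2, 4, 4],
        [2, 2, 4, 4]]
print(block_average(tiny, 2))  # [[0.0, 8.0], [2.0, 4.0]]
```

Resampling the published image down to something like 64x64 and back up gives a feel for how little independent information the reconstruction may actually contain.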
Ironically that is one of the few things that is almost certain. It is really rather hard to image dark features on a disk, so the fact that the visibility function they measured is consistent with that structure is definitive evidence that there is a hole in the donut of plasma.
Whether or not the hole is elongated in the way that it seems in the image is the sort of detail that is hard to be sure about. Likewise the radial spikes at 9, 10, 3 and 5 o'clock - they could easily be artefacts. If they are in about the same places in subsequent images of the same object then they are probably real.
It will be a lot more interesting now that they have a phase solution when they are able to add the polarisation map to it so we can see what the magnetic field structure looks like and do some physics.
has a lot of usable information about the EHT project. That paper looks like the final report of a multi-year project, and the pretty image was created to get a nice-looking cover for the report :-).
The picture has generated a lot of hype among the general public all over the world. It is good that the taxpayers in various countries know what they are paying for.
The real benefit of the project is that now the existing mm/submm telescope electronics have been upgraded to current technology and into a common standard, so that mm wavelength VLBI can be used to study also other objects, not just black holes.
The Rayleigh resolution was 25 uas based on lambda/D.
Yes, you are right, originally the Rayleigh limit was used for separating binary stars in a telescope of a specific diameter. To separate two objects from each other, there must be something darker in between, so we should talk about line _pairs_ (full cycles), and to satisfy the Nyquist criterion, sampling must be done at least twice the spatial frequency.
According to chapter 2.1 of that referenced paper, the resolution was 20 uas (only slightly better than Rayleigh). So with a 120 x 120 uas field of view, 12x12 to 18x18 samples should be used, giving roughly 150-330 pixels total.
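The sample counts above follow from the Nyquist criterion in a couple of lines; this is just my arithmetic on the figures quoted (the 1.5x oversampling factor is an assumption to reproduce the 18x18 upper end):

```python
FOV_UAS = 120.0   # field of view per axis, microarcseconds
RES_UAS = 20.0    # resolution element from chapter 2.1 of the paper

# Nyquist: sample at >= 2x the highest spatial frequency,
# i.e. sample spacing <= half the resolution element.
nyquist_samples = FOV_UAS / (RES_UAS / 2.0)   # 12 samples per axis
oversampled = nyquist_samples * 1.5           # modest oversampling -> 18 per axis

print(int(nyquist_samples) ** 2, int(oversampled) ** 2)  # 144 and 324 pixels total
```

That 144-324 pixel range is where the "150-330 pixels total" estimate comes from.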
It wasn't hyperbole, it was a grand project; did you see the list of authors? The disk array to hold one dish's part of the snapshot was about 1 PB (petabyte, not peanut butter). It turns out that 32 of the new-ish helium-filled units can keep up with two 10G Ethernet streams. Five 32-disk blocks for the 1 PB stations, and ten for the 2 PB stations.
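The recording-time claim checks out from the quoted figures; a quick back-of-envelope sketch (my arithmetic, not numbers from the project):

```python
ETHERNET_GBPS = 10        # one 10G Ethernet stream, gigabits per second
STREAMS = 2               # two streams per 32-disk block

# Sustained write rate in bytes per second.
rate_bytes = STREAMS * ETHERNET_GBPS * 1e9 / 8    # 2.5e9 B/s

STATION_BYTES = 1e15      # a 1 PB station
seconds = STATION_BYTES / rate_bytes              # 400,000 s
days = seconds / 86400                            # ~4.6 days to fill
```

Filling 1 PB at two 10G streams takes about 4.6 days, which squares nicely with "scans for a week" below.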
It's like a 10 GHz digital o-scope that scans for a week to make each trace.
So Einstein would have been easily able to figure out why the rate of expansion is increasing eh? :-D