Is microprocessor an integrated circuit???

The ISO appears to sell it for $399. Those kinds of publications are pricey, that's a fact of life. If someone found it for $18? Sounds like a bargain

Or maybe I'm so smart, so amazingly qualified, and so experienced that I'm not afraid to be misinterpreted or misunderestimated

I admit that I want a refund on some of the tuition I paid because the school was incompetent and I was overqualified. You jump to conclusions; not very open-minded

I said an 8 bit byte can represent the range of -127 to +127 and you said: "Nope"

It's like arguing that the inch scale of measurement is arbitrary; it's not. With only a few rare exceptions in history, a byte is 8 bits, and that is the standard. Note that disk and memory sizes are rated in mega- or gigabytes, and network interface speeds in megabytes per second.

Western Digital could double their disk size by saying a byte is now 16 bits wide.

One of the other factors in using the byte system is base-2 math: unless I'm mistaken, shifting left or right will double or halve the binary (base-2) number. Designers would therefore prefer a byte width that is a power of two: 2, 4, 8, 16, 32.
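A quick sketch of what I mean, just a throwaway C program I'd type up to show it (nothing official, the value 13 is arbitrary and any garden-variety compiler should do):

  #include <stdio.h>

  int main(void)
  {
      unsigned int x = 13;                  /* arbitrary test value */
      printf("%u << 1 = %u\n", x, x << 1);  /* shift left one place: doubles to 26 */
      printf("%u >> 1 = %u\n", x, x >> 1);  /* shift right one place: halves to 6 (integer divide) */
      return 0;
  }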

But since it's already 8 bits, that's how it's used.

When a person begins losing a debate and has no declarative or informative contribution, the discussion degrades in distinct steps: personal attacks ("pig stupid"), then profanity, then rage, then violence.

If you claim to have understood binary at that specific point in time, you would be over 100 years old. Congratulations on being coherent and not senile

Yes, I researched the site; the oddball, erroneous, unauthorized uses of the term "byte" exist. But the SAE measurement in inches has a better chance of being converted to metric than the byte has of being anything other than 8 bits.

When a byte is represented as 9 bits, the 8 bits remain the data; the extra bit is a parity bit, meaning it's a type of wrapper, not a literal 9-bit word. If anyone in history combined 3 octal digits into a 9-bit word, it's technically not the same thing.
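To sketch what I mean by the wrapper, here's a toy C example of my own (even parity, the function name and test value are just mine, not pulled from any spec):

  #include <stdio.h>

  /* Tack an even-parity bit onto an 8-bit data byte. The result is a
     9-bit frame, but the data is still just the 8 bits. */
  static unsigned int add_even_parity(unsigned char data)
  {
      unsigned int ones = 0;
      for (int i = 0; i < 8; i++)
          ones += (data >> i) & 1u;                    /* count the 1 bits */
      return ((unsigned int)data << 1) | (ones & 1u);  /* data plus parity bit */
  }

  int main(void)
  {
      unsigned char b = 0xA5;                          /* 1010 0101, four 1 bits */
      printf("frame = 0x%03X\n", add_even_parity(b));  /* prints 0x14A */
      return 0;
  }

Strip off the low bit and you have the same 8-bit byte back.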

The official C standard is rhetoric?

That's what it means to the majority of the industry, and your failure to provide any examples from the industry shows that I was correct

Well then you won't want to know I've contributed for years to making those things that launch into orbit, whatever they are called, go up and keep the range safe so they fly up there and spin around or whatever they do.

I don't believe a McD's would hire me with my resume; too overqualified

Okay, you win, I'm sorry to upset you so much, if you want to say a 6-bit byte is a byte... no, I won't do it, you're wrong, I can't even pretend to concede, it's an 8-bit byte

Well if you knew binary over 100 years ago, you should then of course know about the Hollerith punched card and what 2629 is.

I like punched card equipment; it was fun to work on, with the mechanical stuff and the electrical too

Well, people do say everyone has to work harder when I'm around

Reply to
Bradley1234

Sure, sorta. I can tell a ROM from a PLA, but perhaps not a PLA from a ROM. Think about a 2D map of inputs vs outputs. A PLA (after arranging the I/O) will have square holes in the map 2^n in size. A ROM likely won't. Given a random change of inputs, a ROM will also have a probability of a change of outputs greater than that of a PLA. Or put another way, a PLA contains less information than a ROM.

--
  Keith
Reply to
keith

Ah, man! Even though you live in SF, I always thought you were more of a herrrrr kinda guy. I'm shocked! Shocked, I tell ya!

--

  Keith
Reply to
keith

Besides, if you feed "10 bit byte" into Google, you get, among others, an IBM document describing a system that uses 10 bits in its bytes. This is not some old computer from the stone age; it is new IBM hardware.

At this point, if Bradley1234 said the sky was blue, I'd look for myself.

--
--
kensmith@rahul.net   forging knowledge
Reply to
Ken Smith

Okay, I got that one wrong; finally something I was actually mistaken on. Obviously it was intended to be "an arbitrarily different size of the defined byte, to double disk space." 4 bits, then.

Reply to
Bradley1234

I quickly found 10 bit/byte also:

formatting link

quote: A sequence of 8 bits is so important that it has a name of its own: a byte. A handy mnemonic for byte is binary digit eight. Thus, it requires one byte of information (or 8 bits) to transmit one keystroke on a typical keyboard. It requires three bytes of information (or 24 bits) to transmit the three-letter word the.

end quote

  1. Science Terms: Distinctions, Restrictions, and Confusions

§ 10. bit / byte

Care to post a link to that document? Is it the currently accepted industry standard as I asked?

Reply to
Bradley1234

I recall from the movie about Alexander Graham Bell that when Watson plucked what can be thought of as an accordion reed over what can be thought of as an electric guitar pickup, Bell remarked that they were the first people in history to send a musical note over a wire. Watson replied that this was nonsense, since Bell had only heard what every electrician had heard at one time or another during the natural course of his work. The difference, of course, is that Bell had understood the significance of the event.

Reply to
MikeMandaville

The VAX had POLY: evaluate a polynomial. Most of the transcendental math functions were microcoded, too.
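Roughly what that buys you in software terms, a little Horner's-rule sketch of my own (illustrative only, not the VAX microcode; the function name and coefficients are just made up for the example):

  #include <stdio.h>

  /* Horner's rule: evaluate c[0] + c[1]*x + ... + c[n]*x^n.
     POLY did this sort of loop as a single instruction. */
  static double poly_eval(double x, const double *c, int n)
  {
      double result = c[n];
      for (int i = n - 1; i >= 0; i--)
          result = result * x + c[i];
      return result;
  }

  int main(void)
  {
      double c[] = { 1.0, 2.0, 3.0 };          /* 1 + 2x + 3x^2 */
      printf("%g\n", poly_eval(2.0, c, 2));    /* prints 17 */
      return 0;
  }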

John

Reply to
John Larkin

So buy it and read it. You're in for an eye-opener.

Or so arrogant that you cannot let go of a long held belief in the tooth fairy after being told by your dentist that they don't exist. "But my mommy wouldn't lie."

They were incompetent, obviously. Your education is sorely lacking as a result. Add that to your suit.

You didn't say "8 bit byte". You said "byte". Even so, you're still wrong. An 8 bit byte can store numbers from 0 to 255 (unsigned) or -128 to +127 (signed two's complement).
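Don't take my word for it; a throwaway check with <limits.h> shows the same thing (a sketch only, assuming an ordinary machine with 8-bit, two's complement chars):

  #include <stdio.h>
  #include <limits.h>

  int main(void)
  {
      /* On a typical two's complement machine with 8-bit chars: */
      printf("signed char:   %d to %d\n", SCHAR_MIN, SCHAR_MAX);  /* -128 to 127 */
      printf("unsigned char: 0 to %d\n", UCHAR_MAX);              /* 0 to 255 */
      return 0;
  }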

The "nope" stands.

It's not rare, and it can be redefined. The definition of a "byte" is not fixed at a certain size. Get over it. Learn your lesson and move along.

If they were selling it to be used in a 16-bit byte machine, sure (wrong way, but I get your drift). Since they're advertising it for (and it's formatted for) a system that uses 8-bit bytes, they have to be consistent with that size.

I bet it torques your jaws that they measure the size in decimal too.

Six-bit bytes were common. IIRC the CDC Cyber series had a 60-bit word and a 6-bit byte. As has been noted, IBM talks of a 10-bit byte in one of its I/O processors.

Nope. It's used by the ignorant to mean eight bits, or it's understood from the context that it's eight bits. A byte is in no way defined as being eight bits, though.

Violence? No. Frustration with a pig-headed troll? Certainly.

More than half that, but I was doing binary (and in all bases up to 32 - got awkward above) in fifth grade, well more than forty years ago. You simply sounded like a snot-nosed kid who thinks he knows everything.

How many bits are in a byte?

You're hopeless. I hope you're not involved in any engineering more complicated than a toaster.

Did I say anything about parity or representations of binary numbers? I'm talking about sizeof(byte) *not* being fixed at eight. It is *usually* equal to eight, but if you assume that as a fact it will come back to byte someone. ;-)

IBM? Yes. The official 'C' Standard? Yes. Life? Yes. Get over it, you're wrong.

I've proven your definition wrong by example. There are many "microprocessors" that are *not* microprogrammed. There are many microprogrammed processors that are *not* microprocessors. An IBM 360 is hardly a microprocessor, though machines of the "same architecture" some forty years later, known as the z-Series, are, in fact, microprocessors. Some models are microprogrammed, some are hard-wired. The terms are orthogonal.

Yet you've not shown one authoritative example of a VAX-11/780 being called a "microprocessor".

They're afraid you couldn't learn the process.

You can continue life being ignorant or you can learn. Your choice.

I know what an 026 is, and an 029. I used them in college, and a couple of times since. 2629 to me is the model number of my laptop (ThinkPad A21p).

My guess is that you're not talking about a ThinkPad.

To clean up your messes, no doubt.

BTW, a decent newsreader would be a good "investment".

--
  Keith
Reply to
keith

;-)

How do you know it wasn't used before? Maybe it was but you just haven't heard of it. :-) That's what Patterson said, that *some* people used the term for that before it became defined as a single-chip processor. It's not my argument, it's Patterson's. He's not God, but he has been around the field about 10-20 years longer than you, and among other things, he is more or less the inventor of the first RISC processor, so I would tend to trust him more. :-)

Reply to
microx

Sure. We do a lot of it. What's BiCMOS got to do with the definition of "byte" or "microprocessor"?

You simply can't read. I gave you a link to a site that quotes the 'C' standard clearly defining a byte as not necessarily being eight bits. Why would the "sizeof(byte)" function be in 'C' if the answer were a constant?
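For what it's worth, here's how I'd poke at it on a real compiler (a sketch, nothing more; CHAR_BIT comes from <limits.h>, and sizeof counts in bytes of CHAR_BIT bits each):

  #include <stdio.h>
  #include <limits.h>

  int main(void)
  {
      /* CHAR_BIT is the bits-per-byte of this particular implementation;
         the language doesn't nail it to exactly eight. */
      printf("bits per byte: %d\n", CHAR_BIT);
      printf("sizeof(char):  %zu byte(s)\n", sizeof(char));   /* 1 by definition */
      printf("sizeof(long):  %zu byte(s)\n", sizeof(long));   /* varies by machine */
      return 0;
  }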

Tell you what... go over to one of the "architecture" groups, like COMP.ARCH or COMP.LANG.C, and propose that a byte is defined as eight bits, or ask why sizeof(byte) is needed since the answer is *always* 8. Be warned: wear your Nomex panties.

Why, they're lying, aren't they? Clearly a gigabyte is 2^30 bytes, no?

Ask on alt.folklore.computers. The users, developers, and architects hang out there. That doesn't change the fact that sizeof(byte) is not a constant.

Well, given that you picked up on two wrong assertions in the same thread and fight like a Tasmanian devil to keep your ignorance, the thought had crossed my mind. I've more than once looked down my throat for a hook.

I understand completely.

OMG! And to think I'm a nuke proponent. Well, it's clear I can't correct your ignorance, but you've certainly done the trick for me!

Wrong. Why would it be there if it were a constant? Hint: it's there *because* the programmer cannot assume it for all ISAs.

The 'C' standard doesn't say it has to be any size, but it must be big enough to represent the character set, which in practice makes the lower limit six.

That really scares me! Someone *so* arrogant as to contradict the standards doing *anything* more critical than making toast.

Why are you changing subjects? We're talking "byte" not "char".

Nothing (other than that they have representatives on the various standards committees). I gave two (not one) examples of where you're wrong.

The processor is not a microprocessor. It is made of several components, thus *not* a microprocessor. Yes, I was instrumental in purchasing an 11/780 in the mid '80s. I know a *little* about it. Tell you what, trot on over to alt.folklore.computers and tell the good folks over there that the VAX-11/780 was a microprocessor. Again, wear your Nomex shorts.

Irrelevant. So were most of the IBM 360 line (only the /75 was hard-wired), but they were *not* microprocessors by any accepted meaning.

MBA types have shown that they can learn. What it is they learn may be in question, but...

You've been shown you're wrong, but will not see.

If most here said the sky was blue, we'd know from previous experience that it was likely to be so. If you said it, we'd have to rethink our experience and check it just to make sure. Is that clearer?

formatting link

That still doesn't change the definition of a "byte" or "microprocessor" just to make you happy.

If you're so interested in corporate "ethics" and other "dirty tricks", why are you funding BillG and M$?

Ah, so you're just a prick that turns in your cow-orkers to management.

BTW, a decent newsreader would be a good "investment". But at least you're not a top-poster, so there may be hope yet...

--
  Keith
Reply to
keith

You have any idea what the C sizeof operator does?

Best regards, Spehro Pefhany

-- "it's the network..." "The Journey is the reward" snipped-for-privacy@interlog.com Info for manufacturers:

formatting link
Embedded software/hardware/analog Info for designers:
formatting link

Reply to
Spehro Pefhany

In article , Bradley1234 wrote: [...]

I said from IBM, which is a small typewriter company and actually of little importance in the computer business, but if you care to read the document, it is at their site as:

formatting link

--
--
kensmith@rahul.net   forging knowledge
Reply to
Ken Smith

Not that I would defend IBM, but it's funny you reference them that way. I must conclude you know a great deal about IBM

It appears in the article that the 8-bit byte is enclosed in a 10-bit wrapper, probably a parity bit for every 4 bits?

Reply to
Bradley1234

I read in sci.electronics.design that MikeMandaville wrote (in ) about 'Is microprocessor an integrated circuit???', on Sat, 29 Jan 2005:

Shouldn't he be sued, then, for facilitating wholesale breaches of recording copyrights? (;-)

--
Regards, John Woodgate, OOO - Own Opinions Only. 
The good news is that nothing is compulsory.
The bad news is that everything is prohibited.
http://www.jmwa.demon.co.uk Also see http://www.isce.org.uk
Reply to
John Woodgate

Show me where the official version is on sale for $18

The tooth fairy? She is married to Jergen Von Strangle, the toughest fairy in fairy world

Let me ask you a simple question: do you know what BiCMOS is? (Not a trick question.)

You have never shown any evidence of this being the current standard, yet you continue to claim the byte definition is arbitrary, and now want to hurry to change the subject when there is nothing to support what you thought was an arbitrary byte size

Right, a 4-bit byte would double it, not 16. But decimal? Not a problem

The CDC Cyber series, but is there any documentation on those old Seymour designs? How can we verify the internal CPU stuff anymore? They got much of the architecture from Univac, some would argue

And since we are being detailed, the 10-bit byte in an IBM (a small typewriter company that doesn't influence the computer world) uses an 8-bit byte for data and 2 bits for parity. Therefore that 10-bit byte IS an 8-bit byte; parity is not part of the byte data, it's a wrapper

I'm here responding, taking smack from people, and discussing the point; hardly trollish. Newbie to the forum, yes.

Yeah, a lot of people say that.

I was trying to get work at a nuke u lar facility doing control systems. Seriously

sizeof() sounds very C-ish, and 100% of the time in application it is 8 bits wide. The only thing close to challenging this is a quote from a $400 book that probably doesn't have pictures, which says a byte is at least 8 bits.

Find anywhere in C code today that redefines sizeof(byte) as other than 8 bits, which would suggest the byte isn't commonly 8 bits. I would design a nuke u lar system controller thing and rest assured a byte is 8 bits and not worry at all.

Sizeof(char) or ulong or ushort? They are all relative.

What does ibm (a small typewriter company) have to do with the official C standard?

The dispute was that some thought my calling it the 11/780 CPU somehow implied the entire system. Have you ever seen a real working 11/780? Ever taken out the CPU board set?

It's microprogrammed, IIRC

That should be a challenge for an MBA type: apply to McD's as someone needing a job, get hired, do well, make improvements, get some award, etc., and not tell them about the MBA thing; do it for the exercise.

I'm not from Missouri, but you just have to show me. I think when we examine the details of a web link, we do not actually find a change in the byte size definition. I like to learn, but I need fact checking, not assumptions.

Somebody said that if "I" claimed the sky was blue they would double-check it? Good for you! That's how it should be; don't kiss somebody's back end and assume they can make decisions for you. Learn to do the work yourself.

No, the original punched card equipment

formatting link

I guess ISO 2629:1973 isn't used much anymore. It was one of those tactics that ibm (a small typewriter company) used to force equipment makers to adopt one standard so they could get competitors to copy it; then ibm would change and make the best equipment with a different standard, forcing smaller companies out of business.

No, I mean people tend to be lazy and pawn off work onto others or spend the time web surfing. If you can imagine, people get annoyed with me when I'm persistent and in their business asking for results at work. I drive people to get results, irritating them like nails across a blackboard, which I find soothing to listen to. In past jobs, co-workers complained about me while the managers wanted to promote me. Doesn't make sense.

Reply to
Bradley1234

In 1981 Modcomp's president went to a California design firm for the design of their fully 32-bit machine, the 32/85. I remember that the section of firmware for the transcendental instructions added to that machine was entitled "Transcendental Meditations" by the programmer.

He was in a hurry for the project to end because he was scheduled to attempt Mount Everest that year. Name of Igor Mamadlian IIRC.

Earlier machines implemented the Floating Point instructions on a separate microcoded processor board. The I/O processors were also separate processor boards. The Space Shuttle launch control system uses Modcomp computers with extra customized CPU boards that are used to communicate with other computers in the system through the 64KB Common Data Buffer.

I designed the CPU of Modcomp's 9250, the gate array version of the 32/85 design. Reduced six 14.5" by 19" boards to one. The Instruction Stream Processor was one 55K-gate standard cell. The Memory Management Control used one gate array. The Data Cache Controller was implemented in two identical gate arrays.

The WCS was implemented in SRAM SIPs, as was the register block storage, which also held the constants for those transcendentals, among other things. All external to the ISP ASIC.

We never called any Modcomp minicomputer or CPU board or even the ISPs of our gate array designs microprocessors. Gee, I guess I should update my resume. According to Bradley I've been designing the damn things for years! Shucks folks! I've designed microprocessors in wire-wrap!

--
Thaas
Reply to
Thaas

Hey, it was a trick to see how deep "Bradley"'s knowledge goes. The real difference is in how the PLA contents are described to the software that creates it. It could be a table, or it could be equations.

For most of the 1970 vintage microprocessors, given their behavior (especially for invalid instructions), I'd expect equations.
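To make the table-versus-equations point concrete, here's a toy C sketch of my own (nothing from a real part; the function names and the particular function are just made up for illustration): the same three-input output bit, once as a ROM-style lookup table and once as PLA-style product terms.

  #include <stdio.h>

  /* ROM view: every input combination listed explicitly. */
  static const unsigned char rom[8] = { 0, 0, 0, 1, 0, 1, 1, 1 };

  static unsigned rom_out(unsigned a, unsigned b, unsigned c)
  {
      return rom[(a << 2) | (b << 1) | c];
  }

  /* PLA view: the same function as a handful of product terms
     (it happens to be the 2-of-3 majority function). */
  static unsigned pla_out(unsigned a, unsigned b, unsigned c)
  {
      return (a & b) | (a & c) | (b & c);
  }

  int main(void)
  {
      for (unsigned i = 0; i < 8; i++) {
          unsigned a = (i >> 2) & 1, b = (i >> 1) & 1, c = i & 1;
          printf("%u%u%u  rom=%u  pla=%u\n", a, b, c,
                 rom_out(a, b, c), pla_out(a, b, c));
      }
      return 0;
  }

Same truth table either way; the difference is whether you hand the tool the whole table or just the equations.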

Mark Zenier snipped-for-privacy@eskimo.com Washington State resident

Reply to
Mark Zenier

Coldfire executes most of the 68K instruction set, and it's not microcoded.

John

Reply to
John Larkin

Go read more carefully. That isn't what they are doing.

--
--
kensmith@rahul.net   forging knowledge
Reply to
Ken Smith
