Nick,
let it rest; if they are more interested in pushing you down, there is nothing you can do about it unless they get experience designing hardware circuits. It's HARD to get into the mindset of _how_ the hardware actually works if you have only software experience. Knowing assembly is very dangerous, as it gives the false impression that you know HOW things work internally.
Hardware does not work serially. It is made up of many, many functional units which all work at the same time. They give the false impression that they work serially, because the interface is serial: one instruction goes in, a result comes out. Right? Wrong.
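A toy sketch of the idea (my own illustration, not any real CPU): three pipeline stages all advance in the same cycle, yet instructions enter and leave one at a time, so from the outside the thing looks serial.

```python
# Toy 3-stage pipeline: all stages "work" every cycle, but the interface
# accepts one instruction and emits one result at a time -- looks serial.
# Purely illustrative; real hardware does this with flip-flops, not lists.

def run_pipeline(program):
    stages = [None, None, None]      # fetch, execute, writeback
    results = []
    todo = list(program)
    while todo or any(s is not None for s in stages):
        # every stage advances in the SAME cycle (in hardware: simultaneously)
        finished = stages[2]
        if finished is not None:
            results.append(finished * 2)   # pretend each instruction doubles its operand
        stages[2] = stages[1]
        stages[1] = stages[0]
        stages[0] = todo.pop(0) if todo else None
    return results

print(run_pipeline([1, 2, 3]))       # [2, 4, 6]
```

Once the pipeline is full, one result pops out per cycle even though each instruction spends three cycles inside; that is the serial mirage.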
There is no clock frequency. It is a mirage. The chips just operate as fast as the physics allows: electricity goes through very tiny transistors, and the flow from various inputs is controlled by those very same transistors. Larger-scale logic is built out of these tiny things.. these make more complicated operations.. these are logic libraries. Different processes are used to make these millions and billions of transistors.. each process has its own logic libraries.. higher-level tools are used to express logic. The higher-level presentation is *different* for different processes and libs.. some libs are totally shit, some are better.
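To make the "tiny switches → logic library" chain concrete, here is a sketch in plain truth values: a CMOS-style NAND modeled as switch behavior, and a small gate library composed from it. Illustrative only.. real cell libraries are characterized per process, with timing and power, not just 0s and 1s.

```python
# NAND from switch behavior: two series pull-down transistors drag the
# output low only when BOTH inputs are high. Everything else is built on it.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor_(a, b):
    # classic 4-NAND XOR composition
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

print([xor_(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

Swap the `nand` body for a different process's cell and everything above it still "works", just with different speed and power: that is roughly what a logic library buys you.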
A logic circuit does not need to be clocked; a lot of logic is done without any kind of clock. The surrounding circuit knows the result is ready by reading a completion line: if it carries current, the logic has finished one iteration. This kind of logic often takes less power.. but at SOME level you have to have a clock driving the interface, so that it is more convenient to talk to the world outside the logic. To connect different blocks together, at least.
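A sketch of that completion idea, under a big simplification: let the combinational wires settle until nothing changes, then treat "settled" as the done line. Real asynchronous circuits signal completion with dual-rail encodings or handshakes, not polling; this is just to show the shape of clockless operation.

```python
# "Clockless" ripple-carry adder: re-evaluate until no wire changes,
# then the result is ready. Bits are LSB-first.

def ripple_add(a_bits, b_bits):
    n = len(a_bits)
    carry = [0] * (n + 1)
    sum_ = [0] * n
    done = False
    while not done:
        done = True                   # assume the wires have settled
        for i in range(n):
            s = a_bits[i] ^ b_bits[i] ^ carry[i]
            c = (a_bits[i] & b_bits[i]) | (carry[i] & (a_bits[i] ^ b_bits[i]))
            if s != sum_[i] or c != carry[i + 1]:
                sum_[i], carry[i + 1] = s, c
                done = False          # a wire changed: keep settling
    return sum_, carry[n]

# 3 + 1 = 4: [1,1,0] + [1,0,0] -> [0,0,1], no carry-out
print(ripple_add([1, 1, 0], [1, 0, 0]))  # ([0, 0, 1], 0)
```

Note there is no cycle count fixed in advance: the loop runs until the data itself says it is done, which is exactly why such circuits can save power on easy inputs.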
There are different classes of logic circuits.. clocked, clockless.. synchronous.. asynchronous.. the interface is clocked and serial for CPUs because it is very difficult (understatement) to make I/O atomic (among other things) without a clock.
My formal education is in the electronics field, but I work writing microcode and drivers.. and I got into my current job as the result of a very long chain of events which isn't a very interesting story to share. The short version: I gave programming a go when I was 11 or so. It was fun. I did graphics. Too slow. Use assembly. z80. m68k. mips. x86. school. arm. pascal. c. c++. ocaml. perl. higher end, higher level. working on graphics. rasterizers. scene graphs. directx. glide. opengl. games. graphics. consulting. hired by a semiconductor firm. been there ever since. happy ending?
The point being, I climbed the tree ass-first.. the software/assembly mindset is SO WRONG when thinking about hardware. You need both if you're an architect; you need to be able to design good hardware, and part of "good" is that you can write a good driver for it. That's why there are a lot of software and hardware engineers. senior. staff. you name it, but fewer architects.
Good night.