Wormhole theory

[bunch of math cut]

This was the basic problem that got infinitesimals tossed out after Newton.

How can two numbers both be infinitely small but not the same?

One would have to be smaller than the other, but they are both infinitely small, which is as small as you can get.

Infinity + 1 (or plus any finite number) is the same infinite number.

1/[infinite] is not going to be smaller than 1/[infinite+1] though you say it is.

Georg Cantor went nuts thinking about these kinds of questions, stop before it's too late for you!

I do want to see how Robinson has done it one of these days. I suspect it's like the square root of -1. You just declare it as a different type of number and deal in multiples of it as you do with j or i. But mixing it in with regular numbers is the path of insanity.

Robert

Reply to
Robert

Oh.

There is a terminology issue here. My phrase "conventional time dilation" refers *only* to the reciprocal effect of each observer seeing the other's clocks run slow. It is not referring to the real-life time increase of relatively moving particles, which is *not* an optical-illusion effect.

Kevin Aylward snipped-for-privacy@anasoft.co.uk

formatting link
SuperSpice, a very affordable Mixed-Mode Windows Simulator with Schematic Capture, Waveform Display, FFT's and Filter Design.

Reply to
Kevin Aylward

Fairly simple... Start with a straight line at some slope, Y=m*X+b; the slope "m" is the derivative, and it is a constant. Step up to a parabola. Make it simple, with the parabola pointing "up" (to catch water??), not too skinny and not too flat, and symmetrical about the Y axis. Draw a tangent at the bottom (minimum Y); it will be horizontal ("m"=0). Draw another at some point on the right side; positive slope. Draw another on the left side, displaced vertically exactly the same as the previous one; negative slope of exactly the same magnitude. Now draw a line using three points: (a) X=same as the left-side tangent point, Y=the slope found there, (b) X=zero, Y=zero (the slope at the bottom), (c) X=same as the right-side tangent point, Y=the slope found there. This will be a straight line that accurately describes the slope, or derivative, of that parabola.
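(A quick numerical sketch of the same idea, in Python, assuming an example parabola Y = a*X^2 + c chosen purely for illustration: estimate the tangent slope at a few points with a small symmetric difference and check that those slopes fall on the straight line 2*a*X.)

def slope_at(f, x, h=1e-6):
    """Estimate the tangent slope of f at x with a symmetric difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

a, c = 0.5, 1.0                     # not too skinny, not too flat
parabola = lambda x: a * x**2 + c   # the example parabola (an assumption)

for x in (-2.0, 0.0, 2.0):          # left tangent, bottom, right tangent
    m = slope_at(parabola, x)
    print(f"X = {x:+.1f}   slope ~ {m:+.4f}   exact 2*a*X = {2*a*x:+.4f}")

# The three (X, slope) points lie on the straight line m(X) = 2*a*X,
# which is the derivative of the parabola -- the same straight line the
# construction above draws.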

** You probably had a good idea of this anyway, and are now ahead of this mundane explanation (if it merits such a label). Have fun; I have complete confidence in you. Do yourself the courtesy of having confidence in yourself.

and at some arbitrary point along it, draw a tangent: hmmm... looks like you drew Y=m*X+b there.

Reply to
Robert Baer

You know, I had discovered this idea entirely for myself. I'd been taking calculus classes and "doing it the traditional way" looking at these funny dx, dy, and dz things as a kind of "special notation" of which I somehow needed to master the use. I treated them as a kind of "paste on" thing to keep track, but that's about it. And some aspect of being completely at ease and facile always seemed to be just out of reach.

Then, in a flash of insight, it all came crashing in. And I got really angry, after a fashion. I was angry at the books and at the teachers for withholding from me the near-trivial insight that made all this so very much easier. And I felt cheated that no one had bothered to tell me the simple truth and had kept it hidden from me.

I'm no mathematician and was learning calculus for my physics classes. But decades later, when I was talking with a mathematician friend I know about this, he just said, "Oh, what you are talking about was put onto a rigorous basis in the 1960's."

I was learning calc starting about 1973 and it wasn't until a few years later that it 'hit me.' I'd never read anything suggesting the idea (that I'm aware of, anyway) and since then hadn't dug much deeper into that insight. It was innate and natural and 'felt right' after some exploration, but it was not the least bit well thought out by me. But it sure made sense and made things MUCH easier to follow.

I've no idea why it took so long for professional mathematicians to put it onto a rigorous foundation. I'd have thought they would have been all over this thing (like teenage boys on a cheap prostitute) and that if I'd ever mentioned this to them they'd have just said, "Oh, that's dead obvious. Anyone can see that. Who thinks otherwise?"

But I guess I was glad to hear that it wasn't just an idiocy of my own that couldn't hold up.

The way I was forced to learn calculus was straight out of the 19th century, the same material from when calculus was made rigorous by Dedekind and Weierstrass. I learned everything from the concept of the "limit," using the epsilon-delta formalism. What a damned pain it all is to me, looking back at it.

I guess I never had to wrestle with accepting the idea of different infinitesimals. I'm not a mathematician so accepting them didn't bother me in the least. And since it was my own internal "discovery" that made all this so much easier to think about for me, it never has crossed my mind to worry much about it, afterwards. It feels natural to me, as it must, since my mind just naturally tumbled to the idea many years ago. I'm comfortable with it, like an old shoe.

I've only just yesterday for the very first time tried to search the web on this subject of "non-standard analysis." I've never read a single paper on the subject, and only knew about it from that conversation I had about 6 months ago with my math friend over dinner. I didn't even know what it was called, because he didn't tell me more. But when others here brought up the name on a different thread, I instantly *knew* that it had to be what he was talking about (the dates matched), so I started looking it up.

I will be getting some of the calculus books based on this idea to see what they say, now. I think it would be interesting to see this from a more thorough and well-considered point of view. For me, it's still in my gut and instinctive and "works well." But I'd like to formalize it some.

And no, I don't think it will be defined quite as you say. More like some kind of distinct, continuous number line.

The way I kind of think about it is that there are smaller and larger infinitesimals along some continuum leading down towards zero itself. And as you go there, you can also take (1/that-infinitesimal) to get a similar continuous line of infinite values. And that this is an easy short cut to seeing why division by zero is uniquely undefined and why division by an infinitesimal is reasonable. If you think of the dx=infinitesimals and their 1/dx infinities, you can see the line of infinities disappearing away into the distance as your dx gets closer to zero, but the connecting line between these two number lines becomes "parallel" at division by zero and thus never intersects the other line -- which is "why" it is undefined.
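(A rough numerical illustration of that mental picture, nothing rigorous; the specific powers of ten below are arbitrary stand-ins for ever-smaller dx values.)

# As dx shrinks toward zero, 1/dx grows without bound; dx = 0 itself has
# no reciprocal at all, which is the informal sense in which division by
# zero stays uniquely undefined while division by a tiny dx is fine.

for k in range(1, 7):
    dx = 10.0 ** (-k)              # stand-in for an ever-smaller "infinitesimal"
    print(f"dx = {dx:.0e}   1/dx = {1.0 / dx:.0e}")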

Jon

Reply to
Jonathan Kirwan

And what truth was this?

Oh?

Indeed.

What you're suggesting here is that all those other mathematicians were so stupid for not seeing the "obvious". This is very, very naive. You can bet your boots that this has been looked at extensively. The issue is that these "obvious" ideas, when actually rigorously investigated, usually don't work. There is not a chance in hell that these "obvious" ideas were not investigated in excruciating detail and found lacking in various ways. It was only much later that mathematics became refined enough to go back and see what went wrong when those ideas were *first* rejected.

Indeed, the very first link I got on "non-standard analysis" was

formatting link

"Newton and Leibniz used infinitesimal methods in their development of the calculus, but were unable to make them precise, and Weierstrass eventually provided the formal epsilon-delta idea of limits"

Some of the problems are identified here:

formatting link
2nd paragraph

So, with all due respect, I doubt you really understand the details as to why such "obvious" ideas were subsequently rejected when first thought of. NSA is non-trivial. If this were not the case, it *would* indeed have been done prior. Things only become "obvious" *after* the fact. There are millions of "obvious facts" that turned out to be completely wrong.

Kevin Aylward snipped-for-privacy@anasoft.co.uk

formatting link
SuperSpice, a very affordable Mixed-Mode Windows Simulator with Schematic Capture, Waveform Display, FFT's and Filter Design.

Reply to
Kevin Aylward

How fast does the particle think our clocks are going? Or put another way, if you were zipping past the earth at 0.99c, and you happened to look at a clock on the earth, would it be going slower than yours?

The point is that your perception of time is a function of relative velocity. The particle decays after a certain interval, but that interval is perceived by stationary observers to be longer than what is experienced by the particle itself. There isn't a problem with this, because your *perception* of time in objects is affected by their relative motion to you; if you consider a motion in some other reference frame, that motion generates an interval, the difference of two spacetime points. Your motion through 'time' also generates an interval. The projection of the interval created by their motion onto the interval created by your non-motion is the perception you have of their velocity and elapsed time, at least according to SR.
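(For concreteness, a small sketch using the standard SR factor gamma = 1/sqrt(1 - v^2/c^2) and the 0.99c figure from the question above; the numbers are only illustrative.)

import math

def gamma(beta):
    """Standard SR dilation factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

beta = 0.99                          # the 0.99c case from the question above
g = gamma(beta)
print(f"gamma at 0.99c ~ {g:.2f}")   # about 7.09

# One unit of proper time on the moving particle is observed as gamma units
# by the "stationary" frame -- and the effect is reciprocal: each observer
# sees the other's clock slowed by the same factor.
print(f"1 s of proper time observed as ~ {g:.2f} s")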

A rendezvous between two particles at two separate points in spacetime is, as you (and JW) say, a different matter, requiring accelerations, which orient the trajectories towards one another again. (Can these accelerations be dealt with properly using SR? Even in x and t?) However, thinking about it as a path through spacetime makes it much more believable that the elapsed time experienced on a particular path would be different for different trajectories.
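(A small numerical sketch of the "different paths, different elapsed time" point, flat spacetime only: integrate dtau = sqrt(1 - v(t)^2/c^2) dt along two trajectories between the same start and end coordinate times. The 0.8c out-and-back speed is an arbitrary choice for illustration.)

import math

def proper_time(v_of_t, t_total, steps=100_000):
    """Integrate dtau = sqrt(1 - v(t)^2) dt, with v given as a fraction of c."""
    dt = t_total / steps
    tau = 0.0
    for i in range(steps):
        v = v_of_t((i + 0.5) * dt)
        tau += math.sqrt(1.0 - v * v) * dt
    return tau

t_total = 10.0                        # coordinate time between the two meetings
stay_home    = lambda t: 0.0          # inertial observer, at rest
out_and_back = lambda t: 0.8          # |v| = 0.8c outbound and return (arbitrary)

print(f"inertial path:     tau ~ {proper_time(stay_home, t_total):.2f}")
print(f"out-and-back path: tau ~ {proper_time(out_and_back, t_total):.2f}")

# Same pair of meeting events, different elapsed proper time along each
# path: the path that turns around ages less (6.0 vs 10.0 here).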

Now, on a separate but related subject, I wonder why there should be a difference between inertial paths and accelerated paths. What is an 'accelerated path' anyway? If I'm orbiting the earth, I don't feel any acceleration, but I'm apparently aging a bit more slowly. Your math pages link points out the idea that two orbits, one circular and one elliptical, will experience different 'proper time'. Neither can tell they are accelerated unless they look out the window. Thus, by comparing notes when they cross paths, they could violate the principle of relativity, and figure out how fast they were going in relation to one another. I guess GR covers this.

Also, when one thinks about the spacetime momentum+energy vector, and sees that mass is just momentum that we happen to be travelling along with, it all gets a bit confusing. To a photon, YOU are the photon, and any photons it happens to be travelling with in the same direction are mass, perhaps making up photon planets and galaxies. To it, your time is standing still, just like to us, a photon experiences no time. An infinite number of parallel realities, corresponding to the infinite directions a photon can move...

Nyuck Nyuck Nyuck.

--
Regards,
   Robert Monsen

"I'm tryin' ta think, but nuttin's happenin!"
                                Curly
Reply to
Robert Monsen

No, I'm not, Kevin.

I have no such sense at all and it must have been the poor way I wrote what I did, if that came across. I had assumed that all this would have been dealt with on a rigorous basis a long time ago -- yes. And I frankly have absolutely NO IDEA at all why it took so long -- yes.

But I also know that my experience is terribly limited in this regard, and I assumed then and still assume now (in other words, I believe) that there are VERY GOOD reasons why it did take that time and that I just haven't yet been exposed to them.

I just don't know what they are, that's all.

No, it's not. Because that isn't what I was thinking, despite how it may have come across to you. Sorry about that.

Agreed. I just had no idea about "why until 1964?" Kevin. But I'm sure there are good reasons for this.

Well, as I already pointed out, Kevin, I've not yet (but will soon) read up on it. I'll be ordering some textbooks this next week to add to my library when they arrive and I'm sure I'll probably be much better for having done so. I still don't know why it wasn't put on a rigorous basis until 1964 or so, but I'm sure it will become clearer to me over time.

My own individual awakening was a personal one and I didn't mean it to reflect on anyone else, Kevin. It was just very nice when it happened, and I was surprised a bit that my teachers and their textbooks seemed to studiously avoid saying anything like it. But my exposure is limited. So that may be all there is to say about it.

I apologize if I left the wrong impression.

Jon

Reply to
Jonathan Kirwan


Ok.

ok.

I read into this that there was a criticism being presented, when there wasn't.

And, apparently, there are reasons why this "new" 1964 approach is not all that it's cracked up to be.

This link,

formatting link
at the criticism section is interesting.

Maybe never. I can't say I'm convinced that this later approach adds anything of real value. The issue is that major breakthroughs that give people grand prizes don't happen very often, e.g. the Schrödinger equation, General Relativity, Teletubbies, etc., so people simply invent some "wonderful" thing that isn't.

I probably jumped the gun a bit here, Jon. It's hard to read between the lines as to how people are really thinking.

What I have found is that there are very few things that are "simple" that have not been already looked at in great depth. There's always a catch.

Kevin Aylward snipped-for-privacy@anasoft.co.uk

formatting link
SuperSpice, a very affordable Mixed-Mode Windows Simulator with Schematic Capture, Waveform Display, FFT's and Filter Design.

Reply to
Kevin Aylward

Terry,

Your teachers were probably either brilliant and therefore figured it was 'obvious' and not worthy of mention, or else they weren't so savvy and didn't recognize it themselves.

:-)

Reply to
Joel Kolstad

"Looking" screws up the data. See Heisenberg's Uncertainty Principle.

...Jim Thompson

--
| James E. Thompson, P.E.                          |  mens     |
| Analog Innovations, Inc.                         |   et      |
| Analog/Mixed-Signal ASIC's and Discrete Systems  |  manus    |
| Phoenix, Arizona      Voice: (480) 460-2350      |           |
| E-mail Address at Website  Fax: (480) 460-2142   | Brass Rat |
| formatting link                                  |  1962     |

I love to cook with wine. Sometimes I even put it in the food.

Reply to
Jim Thompson

This doesn't ring true. The times can be computed exactly. There is no uncertainty. Both see the other's clocks as going slow.

Feynman* actually has a great explanation of this whole thing. He describes a light clock, where a light beam is going back and forth between two mirrors. The mirrors are oriented so the light bounces back and forth perpendicular to the direction of relative motion. Each time it hits the mirror, a 'click' happens. The guy in the rocket carrying the clock sees it going back and forth, ticking at a certain rate, defined by c and the distance between the mirrors. The guy on the ground hears the ticking at a slower rate, because to him, the beam must be going in a zigzag direction, because the rocket and mirrors are moving with some velocity u. If you work out the distance with the Pythagorean theorem, and assume the light is moving at c, the guy on the ground hears ticks at a rate which is smaller by a factor of sqrt(1 - u^2/c^2), which is what is predicted by those Lorentz transforms.
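(A small sketch of that Pythagorean bookkeeping; the mirror spacing and the 0.6c speed are arbitrary choices, and only the ratio matters.)

import math

c = 1.0          # work in units where c = 1
L = 1.0          # mirror spacing in the rocket (arbitrary)
u = 0.6          # rocket speed as a fraction of c (arbitrary)

t0 = L / c                                    # one tick in the rocket frame
t  = L / (c * math.sqrt(1.0 - (u / c) ** 2))  # one tick seen from the ground:
                                              # (c*t)^2 = L^2 + (u*t)^2

print(f"rocket-frame tick:  {t0:.3f}")
print(f"ground-frame tick:  {t:.3f}")
print(f"ratio t0/t = {t0 / t:.3f}  vs  sqrt(1 - u^2/c^2) = {math.sqrt(1.0 - u ** 2):.3f}")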

However, it's obvious that the same thing will occur if a spaceship guy looks at one of these light clocks on the earth as he passes. He'll see it as ticking more slowly than it should be ticking.

An attempt to explain this is given by Feynman*, where somebody mistakenly believes that he can measure width and depth of an object by the angle different surfaces make with his eyes (the surfaces' "apparent size"). This is clearly false, since you can rotate the object and get different apparent sizes for the different surfaces. Same thing, except for spacetime, it's relative velocity and not position that is doing the rotating.

(* Lectures on Physics, Vol I)

--
Regards,
   Robert Monsen

"Your Highness, I have no need of this hypothesis."
     - Pierre Laplace (1749-1827), to Napoleon,
        on why his works on celestial mechanics make no mention of God.
Reply to
Robert Monsen

I read in sci.electronics.design that Jonathan Kirwan wrote (in ) about 'Wormhole theory', on Tue, 29 Mar 2005:

That concept is a lot older than the 1960s. You can see it in l'Hopital's method for evaluating functions that have a limit of the form 0/0 for some value of the independent variable.
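(For example, a quick numeric check using the standard sin(x)/x limit as the 0/0 case; l'Hopital's rule gives cos(0)/1 = 1.)

import math

# sin(x)/x takes the 0/0 form at x = 0; differentiating top and bottom
# gives cos(x)/1, which is 1 at x = 0.

for x in (0.1, 0.01, 0.001, 0.0001):
    print(f"x = {x:<7}  sin(x)/x = {math.sin(x) / x:.8f}")

# The values approach 1.0, matching l'Hopital's prediction.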

Incidentally, I learned differential calculus the 'old' way. I'm no mathematician, either, but I didn't find it difficult. I had a *good* teacher. That makes a lot of difference, maybe all the difference.

You may need the 'limit' concept for the geometrical applications of calculus, starting with the gradient of the tangent.

--
Regards, John Woodgate, OOO - Own Opinions Only.
There are two sides to every question, except
'What is a Moebius strip?'
http://www.jmwa.demon.co.uk Also see http://www.isce.org.uk
Reply to
John Woodgate

I saw this very demonstration in animation on teevee about fifty years ago. It was a Disney/Bell Labs thing, I think. And it didn't go "click", it went "boink." ;-) The one time I did get to go to Disneyland, Tomorrowland was closed for repairs or some such. )-; (in retrospect, given the timeframe, they were probably transforming it to punkland or gothland or something.)

Thanks! Rich

Reply to
Rich Grise

Yeah! All you have to do is make delta infinitesimally small, and you're pretty much there. Although you do have to introduce limits, cf. Zeno's paradox.
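(A quick sketch of the Zeno-style limit: the partial sums 1/2 + 1/4 + 1/8 + ... never reach 1 after finitely many terms, but the limit is exactly 1.)

# Partial sums of 1/2 + 1/4 + 1/8 + ... creep toward 1 but never reach it
# for any finite number of terms; only the limit equals 1.

total = 0.0
for n in range(1, 11):
    total += 0.5 ** n
    print(f"after {n:2d} terms: {total:.10f}")

print("limit: 1")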

Cheers! Rich

Reply to
Rich Grise

The particle can't think :)

Reply to
~~SciGirl~~

They were using your mental picture of infinitesimals from Newton up to at least the Bernoulli brothers for calculus. IIRC, they finally went to the limit concept of Weierstrass, Dedekind, and others because of problems with the infinitesimals. You could use the infinitesimal to derive nonsense. If you are interested in the details, take a look at a very readable explanation in Edna Kramer's "The Nature and Growth of Modern Mathematics".

Amazon shows used copies from about $5.00

formatting link

Robert

Reply to
Robert

Thanks, I'll look.

From what little reading I've done in the last couple of days, the gist is that Abraham Robinson has put the physicist-type insightful methods of Newton (and what I tumbled to, when studying physics) on as solid a mathematical footing as what Dedekind and Weierstrass did in the 19th century.

Personally, I much much prefer this geometrical thinking mode and it has NOT let me down, in terms of producing the same results others produce when using the traditional techniques, and enables faster and clearer insights. I'm not likely to let loose of it. It's worked too well.

But I need to dig into non-standard analysis and see. I'll get the book you mention above and also some of the newer textbooks teaching 1st-year calculus using non-standard analysis. But I doubt that mathematicians would write these textbooks unless they had satisfied themselves that it was a solid idea. Time will tell me, though.

Thanks, again.

Jon

Reply to
Jonathan Kirwan

Don't get me wrong. The book I mention has the details of why they had to get away from the infinitesimal concept in math. Non-standard Analysis isn't dealt with.

Robert

Reply to
Robert

Understood. I'll start with non-standard analysis textbooks then and see what that looks like, first.

Jon

Reply to
Jonathan Kirwan
