ISE sends sensitive information to the Xilinx site!

This afternoon I ran a flow (ISE 8, Linux) and got a warning message in the log. The word 'Warning' was a hyperlink, so I clicked on it hoping to get a more detailed description of the warning.

A few seconds later I was surprised to find myself on the Xilinx site, on a page that displayed the full path of the file I had compiled. The path included sensitive information such as my name, the name of my employer and the code name of the project I was working on.

Xilinx, please be more sensitive to the privacy of your customers.

Jim

Reply to
Jim

Hi Jim,

If true, that's a horrible thing for them to do, but remember that what you see in your browser window resides on your own computer. I'm no web guru, but it might be possible that the Xilinx website generates dynamic HTML (or JavaScript, etc.) that is interpreted by your browser and used to create what you see on your screen without actually transferring any sensitive information to Xilinx.

For example, when they dynamically create a web page for your browser, they might send a reserved environment variable name instead of the real path, which your browser then renders as the actual absolute path.
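
One way to check without clicking, as a rough sketch (the report file name here is made up; use whichever locally generated HTML file contains the warning), is to look at the raw link target and see whether your path appears in the URL itself:

  grep -o 'href="http[^"]*"' design_summary.html | grep -i warn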

Perhaps a Xilinx rep could clarify?

Ron

Reply to
Ron

Hi Folks,

Altera does the same as well. In Quartus II they have a feature called "TalkBack" which reports details back to Altera via an XML file: which software tools you are using (including synthesis, simulation, timing analysis and "others"), design constraints, IP usage, the name of the top-level file, time of compilation, etc. Also reported back are the host ID, NIC ID and C: drive info, which they state in their EULA (whoever reads that!!!) they may use to determine the identity of the user.

Even if you are disconnected from the Net, all details are saved for later transmission. However, I'm sure it is all used to help the user....hmmm.
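
If you are curious what is queued up, here is a hedged sketch (the install path is a guess and the file names are not documented here; the EULA describes the mechanism, not the layout):

  find /opt/altera -iname '*talkback*' 2>/dev/null        # likely-named files
  find /opt/altera -name '*.xml' -mmin -60 2>/dev/null    # XML written during the last hour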

Bob

PS You can disable this feature, but I'll let you read the EULA to find out how.


Reply to
stenasc

The amount of web linkage is getting very annoying. Personally I block all requests with my firewall and run some machines isolated from the internet, but this does lead to occasional other issues, as the tools are beginning to assume the connection to the web is always there. The question is: how long until licence codes for software rely on web access for authorisation?
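
For a Linux tools machine, that kind of blocking can be as simple as this sketch (default-deny outbound is my choice of policy here, not necessarily John's):

  iptables -P OUTPUT DROP              # drop all outbound traffic by default
  iptables -A OUTPUT -o lo -j ACCEPT   # keep loopback so local licence daemons still work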

John Adair Enterpoint Ltd. - Home of Raggedstone1. The Low Cost Spartan-3 Development Board.


Reply to
John Adair

On a sunny day (Thu, 25 May 2006 01:07:48 -0700) it happened Ron wrote in :

Just run snort (a packet sniffer):

  snort -i eth0 -v -d > test.txt

Then grep for your project's codename:

  grep my_secret_project_name test.txt

If it shows anything, move to Altera.
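
If snort isn't to hand, tcpdump does the same job (a sketch; eth0 and the capture file name are assumptions):

  tcpdump -i eth0 -s 0 -A > capture.txt    # -A dumps payloads as ASCII, -s 0 captures full packets
  grep my_secret_project_name capture.txt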

Reply to
Jan Panteltje

On a sunny day (Thu, 25 May 2006 10:30:29 +0100) it happened "John Adair" wrote in :

Or worse, as Billy Windows suggested a long time ago: the tool (application) will run on the server. So you upload your design... over a secure link, of course ;-)

It has not quite happened that way, although some movie and audio sites try hard.

Maybe you then simply pay for access time to the tools. Solves any update problem too.

Reply to
Jan Panteltje

That, of course, is called "time sharing", and is what we used 30 years ago, before PCs arrived. Back to the future...

Reply to
David R Brooks

On a sunny day (Thu, 25 May 2006 03:11:39 -0800) it happened David R Brooks wrote in :

Not all old ideas are bad.... Fire is also an old idea. There are more.

Reply to
Jan Panteltje

So is lighting one by rubbing sticks. Good luck.

I don't want to return to what we did 30 years ago. Those of us who had to deal with 1Mb of memory with multiple power rails in three fan-cooled cabinets (and processors in eight) are glad to see the end of them and all the hassle.

There's no point pretending that processing power and RAM are still expensive. They're cheap. They're very cheap. That's why we now have it locally. Do you want a server to render your graphics images, too? Or maybe you'd prefer to do everything on a command-line? Get real.

Reply to
MikeShepherd564

On a sunny day (Thu, 25 May 2006 14:14:16 +0100) it happened snipped-for-privacy@btinternet.com wrote in :

I think you miss the point. Now, to do the synthesis, many people have to buy advanced (very fast) hardware. An FPGA vendor could team up with, say, Sun (for example), and you would use their server farm. The FPGA vendor would take care of all updates and software-related problems, transparent to the customer.

Think how many man-hours you spend installing, updating and chasing install problems with XST, and all the things you tried to get it working. Then multiply that by the number of people using it. At the current cost of man-hours, and given that you would no longer need the latest hardware, no new software purchases, and site licences made simple, there could well be a financial advantage. ESPECIALLY if the server farm was significantly faster than the normal high-end PC used by (for example) you today. That also saves hours (waiting for a design to finish), waiting for software to ship, etc. etc.

If I really listen to your blunt remarks I almost think you have no clue about software at all (regarding the graphics remark). There is only very little data to be transferred (a listing and bitfile returned, some graphs, really not a lot), but a lot of calculations to be done by the software. The perfect setup for a client-server model. The FPGA vendor could then also team up with other companies to make the best tools available at all times to all. That would for sure make things a lot better to work with. There actually exists a PCB manufacturer (I cannot remember the name of that company) that lets you do boards that way (with their own soft). People seem to be satisfied with that.

So: save time, save money, have the latest bugs fixed all the time, save disk space and hardware, no illegal-copy problems for the FPGA tool vendor, and work faster. It would make sense to work it out in detail.

Reply to
Jan Panteltje

Hi all,

Do you really expect anyone will not grab as much data of interest as possible if they can? Are people really that naive? This case has been caught only because it was done in a clumsy way - a plain HTTP transaction. What about all those "autoupdates" and stuff that all the Windows etc. software does? How difficult is it to interleave the update data with the secretly sent data? How difficult is it to encrypt it a bit so no grep will catch it?

Get real: using a wintel PC means Big Brother is watching you. Some of his smaller cousins are probably included too (as in this case).

Some of those who have wondered why I am so fierce about using my own software under my own OS etc. may now begin to understand part of my motivation ...

Dimiter

------------------------------------------------------ Dimiter Popoff Transgalactic Instruments


------------------------------------------------------


Reply to
dp


It's a very interesting idea (I was thinking about this sort of model the other day). Some of the barriers to adoption would be:

(1) Privacy concerns - do you want your company's crown jewels stored on someone else's server? I think vendors would have to guarantee not only an encrypted link, but strongly encrypted storage for your files as well, where only you have the key (see the sketch after this list).

(2) Tool versions - you'd have to be sure that the vendor wasn't going to switch your build process over to a new version of XST (or whatever) without your permission, or you'll never get any stability;

(3) Availability - could you trust your FPGA vendor to provide 100% uptime? Recent experiences with companies A and X's websites point to "no". :) What about DoS vulnerabilities?

(4) Licensing - the vendor will really have you in a corner if you don't pay up (they could potentially just cut you off and confiscate all your files). Note that I'm not advocating that as a business practice, just pointing out that many people have trust issues here.
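
On point (1), the "only you have the key" part can already be prototyped with symmetric encryption before anything leaves your machine (the file names here are placeholders):

  tar czf design.tar.gz src/ constraints/
  gpg --symmetric --cipher-algo AES256 design.tar.gz   # writes design.tar.gz.gpg
  # upload only design.tar.gz.gpg; the passphrase never leaves you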

Most of those problems are things that "web application delivery" advocates are already thinking hard about anyway, so if that idea gains traction I expect these barriers will slowly fall.

When "compute" truly becomes a commodity, the most likely model would not be "you get to use your FPGA vendor's server farm", but "your FPGA vendor re-sells compute from someone else who runs a server farm, with the FPGA toolchain as a value-add". Which I think is what you meant by "teaming up". I mean, who wants to maintain a data centre / server farm on their own premises any more, anyway? It's expensive, labour intensive and unrelated to most firms' core competencies.

In conclusion, the future looks like the 1970s, just overclocked and with better hair...

-Ben-

Reply to
Ben Jones

Such nonsense. If a GHz-range CPU is not enough, I don't know what is. The hardware costs are negligible from a developer's point of view nowadays. Now if the software has been explicitly written in a way that clogs the system, that is of course another matter...

This is only a relatively small part of the problem. The thing is, they want to sell you something - some sort of software - which afterwards will make you pay them on a regular basis. More than that, they want to be in a position to take your IP - encrypt it as much as you like, the chip vendor can reverse engineer it any time he wants - and, if it is of interest, incorporate it as a "core" or whatever; you will be allowed to complain from the cold outside at will... For me, anyone selling me a product incorporating secrets which can be used to make me pay more or control me in some other way at a later stage is in the blackmail business. Unfortunately a great part of today's industry is heading this way (PLD vendors actually being a not-so-big part of the phenomenon).

Dimiter

------------------------------------------------------ Dimiter Popoff Transgalactic Instruments


------------------------------------------------------


Reply to
dp

And 640K will be enough for anybody...

Have you ever tried multi-pass P&Ring a multi-million gate FPGA design with marginal timing? Plenty of people would pay good money to have their compile times dropped from 12 hours to 12 minutes if the technology were available.

Not if he doesn't have the key, he can't. Of course, the design has to be decrypted and loaded into memory for the tools to process it, I guess...

That'll be those trust issues I was talking about, right there. :)

-Ben-

Reply to
Ben Jones

No, I think you're just annoyed that I make my point forcefully.

Well, I use a 3GHz Pentium with 2GB of RAM. It wasn't expensive. What would these servers use? Something like a 20GHz Pentium, maybe? How much are those? Where do they get them?

When I compile a design on my PC, it gets 100% of the CPU right away. Would your imaginary "server farmers" be happy to have their machines idle most of the time, ready to respond like my PC does, or would my design often have to queue? Don't forget these downsides to the glorious world you imagine.

Actually, I've done little but software for the last 25 years, but that's not the point. You claim you can strip down the needs of my desktop PC (presumably to something like a 286 with a few meg of RAM). Well, to reduce the cost significantly, you'll also need to take out the graphics card and some other complex bits and pieces. But I like my graphics card. It means I can have fast, hi-res graphics in colour. You can't send me those graphics as bitmaps from your imaginary servers, and I don't think you can send them effectively in any other form either, even if you allow me a more powerful processor to render the images.

Being happy with a PC isn't about feeling that you got super value for money. It's much more about response times, and having those response times predictable. If you feel irked when Google pauses for a few seconds, think what it will be like when the same happens as you try to flick between two pages of your EDA tool.

Nobody's stopping you. But I'd like to see the results, not the optimistic theory. Why not contact Larry Ellison with your ideas? After all, his was perhaps the highest-profile attempt to sell this idea of thin clients and powerful servers (though we don't hear so much of it now).

Reply to
MikeShepherd564

Actually, we run a lot of our software here through Terminal Services on a Win2K Server. There are lots of arguments for doing it. The high cost of processing power and RAM is not really one of them. The arguments that convinced us to do this:

  1. Only need to install software on one computer rather than many. Software installation and upgrades can be a very time-consuming (and expensive) process.
  2. Only one central computer (or rack of computers) needs to be protected against power outages and data loss. All of the terminals can be treated as disposable; they don't need much performance, and if something happens they can be wiped clean or replaced.
  3. I can start a job before I leave work (or from my laptop when I get on an airplane), suspend the terminal session, and log in from somewhere else (even Kinkos), on a different computer, at some point later to see the status. All of a sudden nobody needs an expensive laptop (or a tiny, not-so-high-performance one is good enough).
  4. My laptop fried a couple of months back, and I was forced to think about these things. I wasn't able to simply sit down at another computer in the office and get to all of my applications and data immediately. It took 24 hours. That was an unacceptable failure point in the process.

And yes, the GUI is rendered on the server and exported to the client. We obviously don't use this strategy for running something like SolidWorks (a 3D solid modeler), as it has very high bandwidth from the application to the display. Most EE and business applications do not have high-bandwidth GUIs; all of the bandwidth is used to process the data sets.

A Verizon EVDO cellular modem has high enough bandwidth to use our accounting software (QuickBooks Enterprise), which has a somewhat graphically intensive GUI, and exporting the display gives much higher performance than querying the database server using a local copy of the software over this link (or even over a hardwired LAN).

Things like JTAG emulation, device programming and debug, etc., run on clients. In the event that someone wants to sit at his desk while his hardware is sitting in the lab, he can use Terminal Services with WinXP to run these applications. In the event that this computer melts down, it can be wiped clean, given a standard disk image, and he is up and running after installing a single application. Or, simpler yet, in the meantime he can grab another computer and install that single application.
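
For what it's worth, the same trick works from a Linux client with rdesktop (the host name, user and geometry here are made up):

  rdesktop -u erik -g 1280x1024 ts.example.com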

Strategies like this start to make a lot of sense even if there are only two or three users of any given application, and there are at least a few different applications used by any given person.

Regards, Erik.

--
Erik Widding
President
Birger Engineering, Inc.

 (mail) 100 Boylston St #1070; Boston, MA 02116
(voice) 617.695.9233
  (fax) 617.695.9234
  (web) http://www.birger.com
Reply to
Erik Widding

On a sunny day (Thu, 25 May 2006 17:45:18 +0100) it happened snipped-for-privacy@btinternet.com wrote in :

Somebody will do this - somebody who possibly has a financial interest. I, for myself, would like to have that possibility: not having to buy n packages (from n tool vendors), just open an account, use so much computing time, and see the credit card bill at the end of the month. And I would have top-of-the-line tools.

That graphics card thing you mention makes no sense to me; you are not talking about layout or something. The most detailed pics will be for the floorplanner perhaps - hey, we can have Google Maps of the world and zoom in. This is the age of H.264 hi-def via the Internet, 1920x1280 @ 25 fps. Maybe you are still on dialup; OK, for those it will work too. To send an 800x600 screen (all you really need) is about 11.5 Mbit for 8-bit full-colour RGB, without any compression (800x600x3x8). When you zoom or move, only the changes (the new edges that come in) are needed. Add a bit of compression - you do not need 8-bit RGB for a floorplanner, graphs can go in GNU whatsitsname format, error messages are text, and so is your listing. Hey, I am working with video; this is kids' stuff.
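
(To check that figure in a shell:

  echo $((800*600*3*8))   # prints 11520000, i.e. roughly 11.5 Mbit per uncompressed frame
)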

So, no tech limits. Yes, and Sun was looking for projects for their server farms - FARMS, more than one, comprendre? The soft could well split your job into several threads on several machines, so I would not worry too much about speed.

Add a bit of competition, and speed would go up and price (= cost) would come down. From a philosophic point of view, maybe we all want our own - our own music recordings, our own car, our own whatever. And maybe they want to sell everyone that $1000 software tool, even if you use it only every now and then. But in some cases (just as we do not all have our own power plant either), systems like I proposed make sense.

Reply to
Jan Panteltje

Good god, perhaps you're suffering...

Well, for place and route you could certainly conceive of dedicated systems that could better tackle the problem. Any significant reduction in the time it takes to create bitfiles speeds up the design process and increases productivity. Plus you avoid wasting the potential compute power that you have in existing systems; it's more efficient, and you have economies of scale too, so it ends up being cheaper for the user. Xilinx, being Xilinx, could even look at the possibility of using large-scale reconfigurable computers to tackle the problem.

Let me get this straight, though: you believe that Xilinx have written software that deliberately disadvantages their customers by clogging up their machines? If that were so, wouldn't Altera have noticeably better-performing software, or are they in on it too?

OK, again you think that Xilinx are out to get you... If remote application delivery offers increased productivity at reduced risk and cost, then at least some companies will make the not-too-great leap of faith in trusting the company to deliver honestly on their promises. The companies that make the leap to a more productive, efficient, lower-risk and lower-cost process will be more profitable, and the practice will become standard. When you concentrate computing, you can do a better job of reducing the energy used in computation, and the incentive would certainly be there, as power costs would be in the M$/year range for remote application delivery systems. So, you see, it would be good for the environment too.

Why are you so mistrusting of Xilinx? They want to make a profit, and they won't do it by putting their customers out of business.

Civilisation itself is based on trust. You can put safeguards in place to prevent fraud and theft but at the end of the day everything we do is based on mutual trust. Think about when you walk into a bank and hand over a wad of notes to a complete stranger, someone who you've never seen before and may never see again. This doesn't seem to faze us because we, to the greater extent, trust the bank. In exchange for goods and services we're happy to accept bits of paper that correspond to nothing, printed off by the government as and when they feel like it, but we trust them to adopt a rational monetary policy and so we trust that we'll be able to exchange the money for goods and services in turn.

Why stop at running your own software on your own OS? Why not make your own shoes in case Xilinx have bugged them too? Grow your own food, adopt your own currency for trading with other people, build your own house, I could go on...

I'm not saying power isn't abused, and that we shouldn't question those in a position of authority, but we should certainly have some perspective...

Robin

Reply to
Robin Bruce

You got that right :-). And even without the sources - just from the bitstream - the device manufacturer would have no trouble at all understanding the design. They have all the information about the chips they make, you know.

It boils down to one single thing: does the chip vendor have access to your programming data or not.

If you use a wintel PC, the chip manufacturer typically has access to your data. He has had for over a decade, to be more precise. You may still be able to protect your data if you never connect the PC to a network - and make sure there is no wireless hardware beyond your control on it - and never exchange disks with other systems - etc.

Dimiter

------------------------------------------------------ Dimiter Popoff Transgalactic Instruments


------------------------------------------------------


Reply to
dp

Jan Panteltje wrote

That was the system we had at Sun Labs when I worked there a few years ago. And very productive it was, despite not having the zillion PC programs I supposedly rely on.

The thin clients were activated by smart cards. When you had a problem you could take your smart card to the guru down the corridor and show her the exact screen with the problem on display. And when you wanted to work nearer home for a few hours you could take the smart card to a local Sun facility and pick up from exactly the point you left off.

Reply to
Tim
