cloud design flow

Hi there,

I was wondering if there's anyone out there working with a cloud design flow that is tool-agnostic.

The reason for asking is that I'm trying to evaluate whether there's a way to integrate the benefits of a cloud service (with more hardware resources than our in-house ones) into our design flow.

We have noticed several times that our simulations take too much time, so we prefer to go directly to the bench to verify a specific function. This approach forces us to rely on bench availability (which is not always granted) and may not always be efficient (we need to set up special configurations which give us greater observability/controllability on the bench).

Another main advantage is design space exploration: having much more power available would make it possible to throw more CPUs at a specific problem, hence addressing the issue faster.

Last but not least, regression testing and automated testing can be a great advantage of moving to the cloud. Any new module/component which is ready to be integrated into the overall design can be automatically included in the regression suite, which runs regularly and can spot integration issues early.
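
As a rough sketch of the kind of automation I have in mind (the directory layout and the run_sim.sh wrapper are invented for illustration, not any vendor's flow):

import subprocess
from pathlib import Path

TESTBENCH_DIR = Path("regression/testbenches")   # hypothetical layout

def run_testbench(tb: Path) -> bool:
    """Run one testbench through a wrapper script; pass/fail from the exit code."""
    result = subprocess.run(["./run_sim.sh", str(tb)])  # assumed wrapper around the simulator
    return result.returncode == 0

# Any new tb_*.vhd dropped into the directory is picked up automatically
# the next time the regression runs.
results = {tb.name: run_testbench(tb) for tb in sorted(TESTBENCH_DIR.glob("tb_*.vhd"))}
failed = [name for name, ok in results.items() if not ok]
print(f"{len(results) - len(failed)} passed, {len(failed)} failed: {failed}")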

So far I've heard/read about 'Plunify', which supports the Quartus software in the cloud, but we are working with Microsemi (formerly Actel) devices and there's nothing available out of the box.

Should we set up the environment the hard way, i.e. a set of scripts to handle the flow remotely? This might require a great initial effort, but it has the benefit that we know what we are doing and can fix it any time an issue arises (provided that the share of time devoted to fixing problems doesn't eat into design time!). Or maybe there are tools or services out there which already provide such environments?
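
Something like the following is what I mean by 'the hard way': a minimal sketch assuming a build host reachable over SSH; the host name, paths and the 'make synth' target are all invented for illustration:

import subprocess

REMOTE = "builduser@cloud-host.example.com"   # assumed remote machine
PROJECT = "my_design.tar.gz"

def sh(*cmd):
    """Run a local command and stop if it fails."""
    subprocess.run(cmd, check=True)

sh("scp", PROJECT, f"{REMOTE}:/scratch/")                                   # 1. ship the sources
sh("ssh", REMOTE, "cd /scratch && tar xzf my_design.tar.gz && make synth")  # 2. run the flow remotely
sh("scp", f"{REMOTE}:/scratch/reports.tar.gz", ".")                         # 3. fetch the reports back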

Ideas/comments/suggestions are more than welcome. Cheers,

Al

--
A: Because it messes up the order in which people normally read text. 
Q: Why is top-posting such a bad thing? 
Reply to
alb

[]

Uhm, considering the number of replies to this thread, I have only two possibilities to choose from:

  1. nobody is using such a paradigm in this community
  2. nobody using this paradigm is willing to share it with this community

In both cases I'm out of luck, and I guess that if I ever want to go down this path I'll be doing it on my own.

Reply to
alb

Hi Al,

It is not a paradigm; it is simply not a viable business model for most EDA companies. If you look at Plunify, for example, you will see they have only managed to sign up two companies since 2011(?), and neither of them is one of the big 3.

In addition, most companies prefer to keep their regression testing in-house. Splashing their crown jewels onto some cloud server in a foreign country is simply too risky. Also, you nearly always have to tweak your regression suite, and doing it in the cloud might add an extra layer of hassle.

Regards, Hans


Reply to
HT-Lab

[]

This is quite interesting indeed, but surprising as well. When it comes to high computation demand it seems so obvious to move to a cloud service. Actually, before the buzzword 'cloud' came along it was called 'grid', where several farms/computer centers were connected together in order to increase throughput.

I honestly find this quite insane: the same companies are exchanging confidential information via much less secure means like emails, faxes or phone lines, without anyone bothering about encryption.

A two-key encryption mechanism would be safe enough to prevent any intrusion. But if you really want to get paranoid, there are other technologies available, like quantum cryptography.

This I may take as an argument. Nevertheless, I find it quite strange that everyone should build their own computer farm to handle big regression suites, with all the maintenance costs that go with it.

Not only that: how many in-house solutions do we have to develop and constantly debug? On top of that, in-house solutions are rarely scalable unless a big effort is put into them from the very beginning, again requiring a large investment in the IT department and ending up with more network specialists than FPGA designers.

Reply to
alb

The problem is not so much getting the data to the cloud service securely. The problem is trusting the cloud company to protect the data once it is on their servers.

Since Snowden, it has been known that all the major companies providing cloud services are constantly subjected to court orders forcing them to give the authorities access to pretty much anything stored on their servers or otherwise handled there, as well as to information about who is doing what with their services.

And since Snowden it is known that this is not only done if there are suspicions of crime or terrorism, but also as a tool of corporate espionage (see

formatting link

So for any European company it is more or less out of the question to use cloud services for anything important these days. Especially not cloud services offered by US-based companies. Not because they are evil, but because they are subject to US laws that force them to grant access and disclose pretty much anything.

So, cloud storage? Maybe, but only if the data is properly encrypted BEFORE it is put there. Cloud computing for my simulations and synthesis? No way am I uploading my project data anywhere I don't have complete control over.
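
To illustrate what I mean by "encrypted properly BEFORE it is put there", a minimal sketch using the Python 'cryptography' package; the file names are placeholders and key management is deliberately left out:

from cryptography.fernet import Fernet

key = Fernet.generate_key()          # stays on-site; lose it and the data is gone
cipher = Fernet(key)

with open("project_snapshot.tar.gz", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("project_snapshot.tar.gz.enc", "wb") as f:
    f.write(ciphertext)
# Only the .enc file ever leaves the building; the key never does.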

This may sound like paranoia, but it is not. It seems that nowadays every week there's a new revelation proving reality is even worse than what paranoid lunatics have had nightmares about for decades...

Fortunately, my designs are not big enough to require server farms, so for me personally cloud services are of no interest anyway.

Reply to
Sean Durkin

Hi,

My name is Harnhua; I'm one of Plunify's founders and I just wanted to chime in on a much-loved topic.

@alb:

Naturally we agree wholeheartedly with what you said. If you are willing and interested, let's have a chat and maybe we can propose a trial project with the vendor to implement the system you envisioned?

@Hans:

I respectfully disagree with your comments - we are proving that it is a viable business model.

Do you wonder why the big 3 aren't moving to the cloud? They have tried, and are still trying, but their current sales organizations are simply too deeply entrenched in their lucrative all-you-can-eat models to ever want to change. That's a big obstacle they have to overcome. Only the smaller EDA companies see the benefits and are more agile -- we are working with a few who wish to remain anonymous for now.

No less than Joe Costello, former CEO of Cadence, has commented that in the end, only customers can lead the change, not the big 3. And people wonder why the EDA and chip design industries are perceived as "sunset industries" nowadays, steadily losing all the innovation we once had. It is because of the fear of change.

Plunify is our effort to try to effect a change, but it is not change for the sake of changing. More and more customers are seeing the benefits of offloading compute-intensive tasks to pay-by-the-hour datacenters. We've been inspired by seeing cases where an engineering team accomplished in 3 days tasks that would have taken them a whole month!

If you don't mind, may I understand more of your concerns with using a cloud service? That will help educate us on how we can improve.

@Sean:

I understand what you said about not having designs that are big enough to require a server farm, and agree that it is mostly trust, not (merely) technology, that is the issue with entrusting a 3rd party in another country with your confidential data.

The point I'd like to make is: why judge us before you even get to know us? One of our first suggestions to customers is to NOT share their latest and greatest designs if they don't feel good doing so, and just offload their tedious tasks on stable designs, whatever they are comfortable with. For the record, we do encrypt your data before it is stored, if you want to store it. We are Singapore-based, and you can choose European servers if you are concerned about your data falling under US jurisdiction. And we're regular engineers and software developers. Do let's have a chat if you're interested in finding out more.

We spend all our time making sure that our platform is secure and that supported design flows function smoothly at scale. This is what's putting bread on the table, so you can be sure that we will move heaven and earth to make our stuff work as customers would expect it to. As cheesy as it might sound, the belief that Plunify can contribute to our industry as a whole also keeps our team motivated.

Cheers, Harnhua

Reply to
harnhua

Good that you guys are following this group. I wish more people from the EDA industry /lurked around/ here instead of entrenching themselves behind their user forums.

[]

Maybe they do not see their share in the business model you are presenting and are therefore simply not interested. What you present as a win-win solution might not be regarded as such by them.

What, in your opinion, is their advantage in letting you get into this business as a gateway to their tools? The EDA industry has always worked on the principle of /federating/ users [1] with their tools, providing what they need to ship a product early and cheaply.

Now comes your idea that says: 'to hell with their tools, we offer ours on *top* of them and the user no longer needs to deal with the X or A suites'. Wouldn't this approach scare them?

Or maybe it's because they do not see where their share is in this change.

As an end user I'd say that pay-by-the-hour is the best solution, but what do the tool vendors gain with this model?

The point is not only trust, but rather liability. Who would be liable for a leak of information? And how much would that leak cost the customer as well as the provider?

[]

The problem with encrypting storage is where the decryption key is stored. Can you imagine how catastrophic a scenario would be where you *lost* the decryption key, or where it had somehow been forged?

And regarding secure transmission I hope you guys are aware of this:

formatting link

[]

There's only one way to make sure your platform is secure: make it open. This of course might not be a viable solution for you, but it is the only solution that would help you build trust, and even then, the Heartbleed bug I referred to earlier is a clear example of where security might be a showstopper for your model.

Saying you have put in a lot of effort is *all relative*. No matter how skilled your network experts and software developers are, only a large user base and a large number of 'eyes' can guarantee a level of trust sufficient to migrate to the cloud.

Be sure, though, that I second your attempt to provide such a service. Even so, I would be more keen to support you if you were adopting an open standard like OpenStack instead of AWS.

P.S.: I suggest that next time, for readability purposes, you quote text from others in order to avoid the need to switch back and forth to understand what they have said. This is what 'quoting' is for.

Reply to
alb

By "big 3" Harnhua probable meant Synopsys, Cadence and Mentor Graphics.

Reply to
already5chosen

I am glad this is the case; I have nothing against cloud-based services and I always applaud a new startup doing something new. However, you must agree with me that having just 2 tool vendors on board after 4(?) years gives the impression you are not doing so well. You show ISE in your demo but don't list them on the tools page?

I also wondered why you are going after the FPGA market, where large regression tests are less common.

I would say it is an obstacle you have to overcome. The EDA vendors are not going to change their pricing model if they get less money. Most EDA vendors have a flexible licensing scheme; you can get 1- or 3-month term licenses from most of them. Setting up a small in-house regression farm (assuming an FPGA user) is also not a major undertaking. It basically all comes down to how many simulation licenses you are willing to pay for, not the setup.

But unfortunately, for the big clients you need the big EDA tools. Nobody is going to sign off their $10M ASIC on a new, unproven tool. Also, the amount of investment and know-how required to enter this market is enormous, and even if you have something good you will be snapped up by one of the big three, so you are back to square one.

I would put some pressure on them to endorse you.

I would say it's all about $$$.

I have no doubt that cloud services will become more and more popular. It is just that for the moment I don't see a viable business model especially if you are going after the FPGA community.

My Mentor Precision tool has a feature called Precise-Explore which enables me to start 16 simultaneous synthesis and P&R runs, all from the comfort of my desktop.

I hope I am not being too negative, and I would be happy if you proved me wrong.

Good luck with your business,

Hans


Reply to
HT-Lab

Hi alb,

Our focus is actually on FPGA design, and on making all the extra resources in grid/cloud computing produce better quality of results in a shorter time -- that, I think, is the main attraction for people to consider an external compute resource.

Therefore my view of EDA is more generic in nature. Other, much more qualified people have expressed their opinions on the current state of the industry, but it seems like more and more end-customers are unhappy with EDA tool sales/licensing models, and have been for a while. But it's gotten to the point where the industry itself appears to suffer--investment money has dried up, the number of "big exits" is declining, fewer and fewer EDA startups are being created, and so on.

You're probably right. I think they must see the problems, but at the same time are perhaps searching for the "right models" to remain as profitable as before?


On the contrary, we'd like customers to use the current FPGA tools as is, and only offload compute-intensive processes to a grid/cloud computing environment that we help set up. This is so that builds can be scaled up instantly in a guided way to get to design goals. Something like suddenly having "1000 more servers trying strategies to get to timing closure in a day instead of in weeks." We develop plugins, not replacements.

Yes, in their own way, large as they are, the big EDA 3 face many challenges. Opening up the industry, as you mentioned, is great for the end user, but it must work for the vendors too. For instance, every vendor seems to want an ecosystem, but only around their own tools...

Potentially, a great deal more users than before is my claim--the so-called "long tail". They can still charge high hourly prices, and have been doing so for the large customers. Although tool purchase is far from the major cost component of making a chip, it does affect the entire workflow. Furthermore, the population of dabblers, hobbyists and students who will eventually replenish and sustain the industry will benefit from pay-as-you-use models and portability. I'm not saying anything new here, am I? ; )

As you mentioned earlier, people routinely FTP their files to the foundries. Who is liable for a leak in those cases, I wonder? (I don't know.) I think a good approach is what companies like OneSpin are doing--making online tools that don't require users to transfer source code.

You're describing the kind of technical problem for which I think there is an infinite loop of questions and answers. For example, to get a decryption key, one would have to break into the system first. To break into a system, one would have to first find the system. Security vulnerabilities are present in company networks as well as in Amazon. My (biased) view is that companies like AWS, who rely on the cloud for a living, have the most incentive to make their systems secure. Of course we patched ours as soon as we found out about vulnerabilities like the one you posted. This is also probably where proper customer education and IT audits come into play. Doubters will always doubt--there will be people who will never contemplate using a cloud service for chip design, but I think with the proper communication and technology, more and more companies will turn to a "cloud" way of designing.

I like your statement very much. Ideally, tools can export non-reversible data so that users don't have to upload source code.

True; it's a chicken-and-egg situation which we can only address by working with customers.

Thank you for your thoughts and feedback!

Done!

Cheers, Harnhua

Reply to
harnhua

Hi Hans,

Thank you for your thoughts!

Absolutely agree with you. Building a good EDA platform is an uphill task, and there are so many factors involved in making it successful. So far, what's been mentioned are vendor relationships/motivations, end-user requirements and security. Another one is that the tools themselves are not designed to take advantage of a cloud-like infrastructure. Over the years, engineers have learned to overcome design problems through other means, because not everyone has access to compute resources. Even with ample servers, the perceived gains that the tools bring don't outweigh the perceived costs, if that makes sense.

So to build a good FPGA/EDA platform, we just have to chip away bit by bit at the different problems. Some flows we have in beta, some we can release and some we cannot, so please bear with us and work with us if you have a demand for specific tools and flows.

Because we believe that we can use existing FPGA tools to get better quality of results, on top of the "convenience" benefits of a platform.

: ) Yes, I concur with what you wrote here, for what I tend to call the "convenience" benefits.

This, I think, has some implications for the ASICs vs. FPGAs question and is part of the challenge that small FPGA/EDA startups (including us) have to solve. However, it doesn't have to be a "small company versus big company" thing. It sounds a bit too idealistic perhaps, but I really think all the vendors and the customers need to work together to address this, so that the amount of $ to be made can grow as a whole.

From the infrastructure perspective, it's great that you have the hardware and licenses to do this, and it's great if this is all that you need -- these are the "convenience" benefits that I mentioned above.

From a quality-of-results point of view, what if I say that by running our tool from your desktop to launch 16 simultaneous synthesis and P&R runs on your server farm / local PC / AWS, you can actually improve your synthesis and P&R results? Is that a better value proposition from your perspective? (That's what we're working on.)

Thank you!

Cheers, Harnhua


Reply to
harnhua

[]

There might be some technical limitations with some tools. For example, I seem to be stuck with using only one core for Designer (the P&R tool from Microsemi), and virtualizing 1000 cores wouldn't do much, unless I'm running 1000 parallel processes with 1000 different sets of parameters...
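
To make that concrete, a rough sketch of the kind of parameter/seed sweep I have in mind; the run_par.sh wrapper and its --seed flag are invented for illustration, not the actual Designer command line:

from concurrent.futures import ProcessPoolExecutor
import subprocess

def run_par(seed: int) -> tuple[int, int]:
    """Launch one place & route attempt with a given seed; return (seed, exit code)."""
    return seed, subprocess.run(["./run_par.sh", f"--seed={seed}"]).returncode

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=16) as pool:
        for seed, rc in pool.map(run_par, range(16)):   # 16 independent attempts
            print(f"seed {seed}: {'ok' if rc == 0 else 'failed'}")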

I'm curious about this; I did not know tools like those could be extended with plug-ins. Do such features exist for all tools? Can you really plug extensions into simulators, synthesizers, etc.?

[]

Because their lock-in strategies come from a distorted view of profit. History has already shown the consequences of such strategies (see the 'Unix wars'). Unfortunately, proprietary hardware will always be fertile ground for proprietary software and lock-in mechanisms that choke creativity and progress.

Dabblers, hobbyists and students will not throw a dime at this business. There already exist enough 'free and open' tools that let people play around enough to get their project/toy/hobby done, and the EDA vendors provide free-of-charge licenses to let them play.

I've always been curious to understand the differences between the software community and the hardware one, and why we haven't yet felt the same need to share, contribute and collaborate the way the software guys do. In the end my conclusion is that you are always stuck with a proprietary tool, even if you can download it free of charge and use it without paying.

[]

To get a decryption key you do not need to get in; it is sufficient that somebody 'gets out' with it. It has long been known that secrecy does not guarantee security, and here is an excerpt from the PGP FAQ:

Q: Can the NSA crack PGP (or RSA, DSS, IDEA, 3DES,...)?

A: This question has been asked many times. If the NSA were able to crack RSA or any of the other well known cryptographic algorithms, you would probably never hear about it from them. Now that RSA and the other algorithms are very widely used, it would be a very closely guarded secret.

The best defense against this is the fact the algorithms are known worldwide. There are many competent mathematicians and cryptographers outside the NSA and there is much research being done in the field right now. If any of them were to discover a hole in one of the algorithms, I'm sure that we would hear about it from them via a paper in one of the cryptography conferences.

This shift can only happen if the big EDA vendors are willing to let it happen. That means users *and* new entrepreneurs are tied to a business model led by somebody else. Compiler vendors back in the late '80s were facing a dramatic shift because a competitive and much more powerful tool was distributed free of charge and free of lock-ins (gcc), and there were companies making a business out of it (Cygnus).

Nowadays there cannot be a 'one-man show' like in those days, and I agree that it is unlikely that someone will start writing an open and free synthesis tool, but without it our community is bound to the ups and downs of those big guys who are leading the show.

If you limit your verification to your customers, then you're likely going to fail. There are tons of incredibly gifted geeks out there, and if you do not open your systems to peer review, then you will never leverage that power.

Your code will always be *yours* because you know every bit of it, not because you put a patent on it or closed it to the world.

Al

Reply to
alb

Because it is (1) possible and (2) necessary.

1) in some cases it is easy to have a dongle without obnoxious DRM: the hardware itself

2) inside the hardware there is a lot of highly commercially sensitive information that does not need to be (and is not) visible to the customer.

3) the internal structures and performance are imperfectly understood and modelled. That's a significant problem for the manufacturer, infeasible if you are trying to keep third parties aligned.

4) the up-front cost-of-entry (non-recurring engineering) charges are prohibitive. Starting small in a garage and then having incremental improvements is no longer an option.

Reply to
Tom Gardner

I'm confused. Are you saying that 'sharing, contributing and collaborating' is possible and necessary? Then what are you trying to say with the points that follow?

I don't get this point.

There's a lot of highly commercially sensitive information in your latest i7 or whatever processor you have in your PC, but that does not prevent you from running a completely free tool to build software for it.

Sorry, but I fail to understand this point as well.

I'm certainly not dreaming about that. But I believe there are enough resources and capabilities that, if joined together, can make a big change.

Reply to
alb

Sorry! In the hardware world it is possible and necessary to avoid collaborating and sharing w.r.t. the internals of many devices, particularly FPGAs.

That's the "it is possible to avoid..." part.

They have an extremely comprehensive specification of what you can rely on seen from the outside. You can't modify anything on the inside, for many good reasons.

That point is clear, provided that you have some concept of semiconductor modelling.

You need detailed plans as well as dreams. The devil is, as usual, in the details.

Reply to
Tom Gardner

Hi Tom, Tom Gardner wrote: []

I'm not advocating sharing your crown jewels; I'm only advocating standards which allow X and Y to build their FPGAs while W and Z build the tools freely. In order to produce perfectly suitable machine code I do not need to know how the clock is routed in your CPU, nor how you optimized the carry chain; I simply need to know the opcodes and the registers it uses.

If you want to make a very rough parallel with the FPGA world, I do not need to know more than the logic provided by a logic cell (which is documented in the datasheet) and the associated delays, which are also reported in the datasheet. [1]
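
To illustrate the (admittedly naive) datasheet-level view I have in mind: per-cell delays summed along a path. The cell names and figures below are invented, and real tools obviously model far more than this:

CELL_DELAY_NS = {"LUT4": 0.4, "CARRY": 0.1, "DFF_CLK_TO_Q": 0.6}  # made-up figures

def path_delay(cells: list[str]) -> float:
    """Sum the per-cell delays along a register-to-logic path."""
    return sum(CELL_DELAY_NS[c] for c in cells)

print(path_delay(["DFF_CLK_TO_Q", "LUT4", "CARRY", "LUT4"]))  # -> 1.5 ns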

A tool that is free as in *libre* can be a turning point in opening up our community to a much more effective exchange, with a set of tools that keeps growing and increasing design efficiency.

[]

And what prevents us from having a similar /comprehensive specification/ for an FPGA?

Uhm, I thought I was lacking some *basic* information about /semiconductor modelling/, therefore I googled it, and Wikipedia reports:

Internal structures are difficult to model, so what? Do you think my synthesis tool needs to know the physics of my device? Or can it simply rely on 'inaccurate but good enough' models of logic elements and delays?

Who are the 'third parties' that need to be 'aligned' (and aligned to what)?

You do not need a detailed plan; you need motivated and talented people. Often what is lacking is either the motivation or the skills.

Al

[1] I may not have all the insight into what it takes to create a synthesis tool, and I'd be happy to hear about it.
Reply to
alb

You almost certainly do need to know more, unless you are just making a toy tool or one vastly simplified so that it is best suited for educational purposes.

Start by considering wire delays, routing constraints, simultaneous switching transients, and many more non-ideal aspects of operation that need to be considered in modern FPGAs. Look at a vendor's tool to see the type of constraints that have to be specified by the user in order to have a reproducible implementation. Use a vendor's tool to see the type of warnings that the tool emits.

Have a look at the Xilinx documentation for just one of its families. Download their "Documentation Navigator-2013.4 Utilities" for their Vivado tool.

Go ahead and do it. It would help if you have already completed several designs, since then you will know the pain points that designers would like removed. There's little point in inventing yet another way of doing something that is already easy to do.

See my point 2. If you don't understand that then it indicates you haven't implemented a design for a modern FPGA.

Browse the documents in Xilinx's "Documentation Navigator-2013.4 Utilities"; it might enlighten you.

It needs to model them in sufficient detail to be able to get good reliable predictions and implementations.

Not sufficient.

People like you are the third parties, and they would need to be aligned to the secret internal non-idealities.

Such people are necessary but not sufficient.

If you don't have a plan then, by definition, you don't know what you are going to do - let alone how to do it.

Reply to
Tom Gardner
