Makefile or not?

I definitely second the "students on summer jobs" opinion. Over the years I've seen a lot of sample/library code from silicon vendors and most of it was truly awful. It was often clearly written by somebody who didn't have a working knowledge of either the hardware or the language they were using. Sometimes it just plain didn't work, but since the authors obviously didn't understand what the hardware was actually supposed to do, they had no way of knowing that.

In my experience, trying to use anything from silicon vendors beyond the header files with register addresses/structures has always been a complete waste of time.

--
Grant Edwards               grant.b.edwards        Yow! 
                                  at               BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI-BI- 
                              gmail.com
Reply to
Grant Edwards

That's been our experience too. We reported a bug (with included fix) in a particular vendor's module, and their response was not to fix the bug but to delete the module from their portfolio. Then a few years later the module reappeared - with the bug still present.

Theo

Reply to
Theo Markettos

I'll ask my colleague who's doing most of the work on that.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

I've used it, circa 2006-7, and for my application (highly multithreaded 3D electromagnetic simulation on an SMP) it was amazing--it blew the doors off both Visual C++ and gcc under Cygwin. (For sufficiently permissive values of 'amazing', i.e. 1.5-1.8x on the same hardware.) ;)

In defence of Eclipse, it does do a much better job of humanizing gdb than the other things I've used, such as ddd.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

Ah, I remember ddd from decades back. It always seemed about 70% finished.

Is it still a thing?

... there appears to be a Gentoo ebuild:

# emerge --search ddd
[ Results for search key : ddd ]
Searching...

*  dev-util/ddd
      Latest version available: 3.3.12-r4
      Latest version installed: [ Not Installed ]
      Size of files:            5,554 KiB
      Homepage:                 formatting link
      Description:              Graphical front-end for command-line debuggers
      License:                  GPL-3 LGPL-3 FDL-1.1

[ Applications found : 1 ]

However, it looks like 3.3.12 was released almost 10 years ago.

The TCL/Tk based GUI that sort of "came with" gdb for a while about 10-15 years ago wasn't too bad (can't remember its name). But, for the most part I prefer the gdb command line -- though I sometimes use gdb-mode in emacs.

Most of the embedded stuff I work on isn't amenable to interactive breakpoint/step/examine/resume type debugging anyway. Milliseconds after you hit a breakpoint, all sorts of hardware and protocols will start to time out, overflow, underflow, and generally get upset. Once you stop, you can't expect to step/resume and get useful behavior.

Non-embedded stuff I do in Python, and don't need a debugger. ;)

--
Grant
Reply to
Grant Edwards

Intel's C (and C++) compiler is still very much a major choice for the x86 platform, with good support for the latest standards and a fair degree of gcc compatibility (inline assembly format, attributes, etc.). It is generally considered to be the best choice for automatic vector SIMD code generation, and has support for parallelising code using multiple threads. But it is also well known for making code that is particularly poor on non-Intel x86 processors.

Agreed - Eclipse + gdb is a perfectly solid debugger. It is not always perfect, but no debugger I have ever used is always reliable or works as you expect. Certainly it is fine for most debugging purposes.

In the past, I have used both ddd and gvd (which later became part of gps, the GNAT Programming Studio) as front ends. There are plenty of other gdb front-ends available - those with a strong sense of irony might like to try using MS Visual Studio.

Reply to
David Brown

Actually it was an excellent compiler: it was the absolute best for highly optimized code through the 80's and 90's. It was, however, persnickety and infamous for its barely decipherable errors and warnings. I think the word "difficult" sums it up.

Intel's compiler STILL is the best on x86 for floating point and for auto vectorizing to use SIMD. In recent years GCC has taken the lead for integer code.

For a long time Microsoft acknowledged that Intel's compiler was superior: it is an open secret that Windows itself through NT was built using Intel's tool chain, and that the OS kernel continued to be built using Intel up through XP. Vista was the first Windows built entirely on Microsoft's own tool chain.

Through several major versions, Visual Studio included a configuration switch that directed it to use Intel's tools rather than Microsoft's. This always was possible anyway using foreign tool settings, but for a long time Intel's tools were supported directly.

George

Reply to
George Neuner

How about for FPGAs? ;)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

I spent some time working with a NIOS2 core on an Altera Cyclone-something-or-other. In the beginning, somebody got conned into using the Altera tools for doing software development. As expected, they were horrendous. It was Eclipse with a bunch of plugins.

IIRC, there were Eclipse plugins that called scripts written in bash that called Perl scripts that called Java apps that generated TCL that got fed to other scripts that generated header files... and on and on and on. The tools required more RAM than most of our development machines had. And it appeared to re-generate everything from scratch every time you wanted to build anything.

After fighting with that for a few months, we threw it all out and started from scratch with the GNU toolchain, makefiles, and our own header files we wrote with info gleaned from the above mess.

There was also some sort of gdb-server executable that we extracted from deep within the bowels of the Altera IDE. We had to write some sort of wrapper for that to get it to run stand-alone and talk to the USB byte-blaster thingy.

Once we ditched the massive pile of Altera's garbage IDE, things went much smoother. [Until, as the project neared completion, it became obvious that the performance of the NIOS2 was nowhere near what was promised, and the whole thing was abandoned.]

The hardware guys were, of course, chained to the Altera VHDL IDE software for the duration -- presumably for heinous sins committed in a previous life.

--
Grant Edwards               grant.b.edwards        Yow! Kids, don't gross me 
                                  at               off ... "Adventures with 
                              gmail.com            MENTAL HYGIENE" can be 
                                                   carried too FAR!
Reply to
Grant Edwards

Nah, you can get out of the IDE there too. You wind up having to write Makefiles that write and call Tcl scripts that communicate with a jtag-server executable that you extract from deep within the bowels of the IDE. It's deeply unpleasant, and still preferable for production code to using the IDE.
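(If all you need from that jtag plumbing is getting a bitstream into the part from a Makefile, a simpler route - sketched below with an invented .sof name and cable, not the actual jtag-server dance described above - is to shell out to quartus_pgm from a small Tcl helper run under quartus_sh:)

# program.tcl -- hypothetical helper, run as: quartus_sh -t program.tcl
# Programs a compiled .sof over JTAG by shelling out to quartus_pgm.
# The cable name and .sof path below are placeholders.
set sof   output/example.sof
set cable "USB-Blaster"

# "-m jtag" selects JTAG mode; "p;<file>" means program/configure with it.
if {[catch {exec quartus_pgm -c $cable -m jtag -o "p;$sof"} result]} {
    puts "Programming failed: $result"
    exit 1
}
puts $result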

--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com 
Email address domain is currently out of order.  See above to fix.
Reply to
Rob Gaddi

Can you avoid using the IDE to compile the VHDL and build the various formats of bitstream files?

--
Grant Edwards               grant.b.edwards        Yow! My NOSE is NUMB! 
                                  at                
                              gmail.com
Reply to
Grant Edwards

Mostly. You (practically) have to use the IDE to configure the settings file, the .qsf, which tells it what bitstreams to make, what the source files are, etc. Once that file is correct (and it's text, so it's VCSable), you can just run make.
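For anyone who hasn't looked inside one: the relevant part of a .qsf is just a flat list of Tcl-style assignments, roughly like the made-up excerpt below, which is why it diffs and merges happily under version control.

# Excerpt of a hypothetical .qsf -- Quartus maintains this file itself,
# but it's plain text, so it lives in the VCS alongside the makefile.
set_global_assignment -name FAMILY "Cyclone III"
set_global_assignment -name DEVICE EP3C5F256
set_global_assignment -name TOP_LEVEL_ENTITY top
set_global_assignment -name VHDL_FILE src/vhdl/top.vhd
set_global_assignment -name VHDL_FILE src/vhdl/regs.vhd
set_global_assignment -name QIP_FILE src/cores/pll.qip
set_location_assignment PIN_A1 -to clk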

See below, one of my team's Quartus makefiles. We're doing the same in Xilinx Vivado these days, which was again a tedious and awful process to get going. I have no idea why no FPGA vendor believes that repeatable build control is something that matters to their customer base; left to my own devices we'd be doing CI on the version control server.

########################################################################
# This is the makefile to build 22C230B, the FPGA for the V230 Analog
# Input Module.  It builds rbf 22C230B.rbf to be used by the 22E230
# for programming on an EP3C5F256.
#
# The default target builds the necessary image.  Other targets are:
#   reg_map : Builds the register map files.
#   clean   : Removes all build products.
#
# Karla Vega, Highland Technology, Inc.
# 29-May-2014
########################################################################

########################################################################
# Tools and binary locations
########################################################################

SHELL ?= /bin/bash
QUARTUS ?= $(QUARTUS_ROOTDIR)

IS_CYGWIN := $(findstring CYGWIN,$(shell uname))
ifneq "" "$(IS_CYGWIN)"
QUARTUS_BIN := $(shell cygpath "$(QUARTUS)")/bin
else
QUARTUS_BIN := $(QUARTUS)/bin
endif

quartus_map := $(QUARTUS_BIN)/quartus_map
quartus_fit := $(QUARTUS_BIN)/quartus_fit
quartus_asm := $(QUARTUS_BIN)/quartus_asm
quartus_sta := $(QUARTUS_BIN)/quartus_sta
quartus_sh  := $(QUARTUS_BIN)/quartus_sh

########################################################################
# Project configuration.
########################################################################

PROJECT := 22C230
REV := B
DRAFT := 0
DEVICE_FAMILY := "Cyclone III"
DEVICE := EP3C5F256
DEVICE_SPEEDGRADE := 8

FINAL := 22C230$(REV)$(filter-out 0,$(DRAFT))

# Bring in sources.mk, which is autogenerated from the Quartus project file.
ifeq "$(findstring $(MAKECMDGOALS),clean)" ""
include sources.mk
endif

ASSIGNMENT_FILES = $(PROJECT).qpf $(PROJECT).qsf

OUTPUT_DIR := output
CORE_DIR := src/cores
REG_MAP_DIR := src/reg_map
VHDL_DIR := src/vhdl

# Composite source list
SOURCES = $(22C230_SOURCES)

# Destination list
RBF_FILE := $(OUTPUT_DIR)/$(FINAL).rbf

########################################################################
# Phony targets
########################################################################

.PHONY: all reg_map

all: $(ROM_FILE) $(OUTPUT_DIR)/$(PROJECT).asm.rpt $(OUTPUT_DIR)/$(PROJECT).sta.rpt

clean:
	rm -rf db incremental_db $(OUTPUT_DIR) *.chg sources.mk

########################################################################
# Rules
########################################################################

# No implicit rules, they won't do us any good.
.SUFFIXES:

# Quartus forces us to keep the list of all the source files in the .qsf
# file.  In the interest of avoiding redundancy, we have a Tcl script to
# rip these out and turn them into the sources.mk file that we include
# earlier on.  Therefore, if the .qsf file changes, or if the SOPC_TARGET
# needs rebuilding (which changes the .qip file which changes the file
# dependency list), we'll rebuild the sources.mk file.
#
sources.mk: $(ASSIGNMENT_FILES)
	$(quartus_sh) -t list_files.tcl $(PROJECT)

# Quartus build process from the --help=makefiles option.
STAMP := echo done >

$(OUTPUT_DIR)/$(PROJECT).map.rpt: map.chg $(SOURCES)
	$(quartus_map) $(MAP_ARGS) $(PROJECT)
	$(STAMP) fit.chg

$(OUTPUT_DIR)/$(PROJECT).fit.rpt: fit.chg $(OUTPUT_DIR)/$(PROJECT).map.rpt
	$(quartus_fit) $(FIT_ARGS) $(PROJECT)
	$(STAMP) asm.chg
	$(STAMP) sta.chg

$(OUTPUT_DIR)/$(PROJECT).asm.rpt $(RBF_FILE): asm.chg $(OUTPUT_DIR)/$(PROJECT).fit.rpt
	$(quartus_asm) $(ASM_ARGS) $(PROJECT)

$(OUTPUT_DIR)/$(PROJECT).sta.rpt: sta.chg $(OUTPUT_DIR)/$(PROJECT).fit.rpt
	$(quartus_sta) $(STA_ARGS) $(PROJECT)

$(OUTPUT_DIR)/smart.log: $(ASSIGNMENT_FILES)
	$(quartus_sh) --determine_smart_action $(PROJECT) > $(OUTPUT_DIR)/smart.log

###################################################################
# Project initialization
###################################################################

map.chg:
	$(STAMP) map.chg
fit.chg:
	$(STAMP) fit.chg
sta.chg:
	$(STAMP) sta.chg
asm.chg:
	$(STAMP) asm.chg

-include local.mk

And the referenced list_files.tcl

# list_files.tcl
#
# Extracts all of the source file names from the Quartus project
# settings file, and writes them into sources.mk so that they can
# be pulled into the makefile.
#
# Rob Gaddi, Highland Technology.
# 15-May-2013

set source_files {VHDL_FILE VERILOG_FILE QIP_FILE}

set projname [lindex $argv 0]
project_open $projname

set hOut [open {sources.mk} {WRONLY CREAT}]
puts -nonewline $hOut "[string toupper $projname]_SOURCES := "

foreach ftype $source_files {
    foreach_in_collection dat [get_all_global_assignments -name $ftype] {
        set fn [lindex $dat 2]
        puts $hOut "$fn \\"
    }
}
puts $hOut ""
close $hOut
project_close

--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com 
Email address domain is currently out of order.  See above to fix.
Reply to
Rob Gaddi

In related news,

formatting link
seems to have come along quite a way since the last time I looked in on it. Might have to give it a try on my next project.

--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com 
Email address domain is currently out of order.  See above to fix.
Reply to
Rob Gaddi

We have a fully Makefile-based FPGA toolchain, which is critical for continuous integration builds, but generally projects are begun and tweaked from the GUI - typically it's not a one-way street (so you can open the project files used by the Makefile build in the GUI). While in principle all the tools can be driven from tcl, by the time you've worked out the hundred tcl statements you needed you might as well have used the GUI.
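(To be fair, the compile itself is the cheap part: quartus_sh ships a "flow" package, so once a project exists, a minimal build script - project name invented - is roughly the sketch below. It's the project creation, pin assignments and IP generation that cost the hundred statements.)

# compile.tcl -- minimal sketch, run as: quartus_sh -t compile.tcl
# Assumes the project/.qsf already exist; "myproj" is a placeholder.
load_package flow
project_open myproj
# Runs the full map/fit/asm/sta flow, equivalent to "Start Compilation".
execute_flow -compile
project_close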

I had a play with hdlmake, as we have an increasing need to do Intel and Xilinx builds from the same codebase. It handles some basic stuff, like pin assignments, but anything of complexity (e.g. instantiating vendor IP cores) is going to need the vendor tools. hdlmake does avoid having to know the incantations to call the Intel/Xilinx/etc. parts of the build system, replacing them with a single command, but that's not the biggest problem.

(My current issue is Xilinx IP Integrator's idea of schematic capture from the 1980s, complete with a mush of overlapping wires, and I'm trying to work out whether I can build complex SoCs entirely from Tcl - in this case I think the GUI is so awful that anything is better.)
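(A rough sketch, for the curious, of what driving IP Integrator purely from Tcl looks like - the part number, IP versions and names below are illustrative only, and the clock/reset hookup is omitted:)

# bd_sketch.tcl -- assumption-laden sketch of scripting a block design
# from the Vivado Tcl console; part, VLNV versions and names are made up.
create_project -in_memory -part xc7z020clg484-1
create_bd_design "system"

# Instantiate IP by VLNV rather than dropping it on the canvas.
create_bd_cell -type ip -vlnv xilinx.com:ip:processing_system7:5.5 ps7
create_bd_cell -type ip -vlnv xilinx.com:ip:axi_interconnect:2.1 ic0
create_bd_cell -type ip -vlnv xilinx.com:ip:axi_gpio:2.0 gpio0

# Interface-level connections instead of drawing individual wires.
connect_bd_intf_net [get_bd_intf_pins ps7/M_AXI_GP0] [get_bd_intf_pins ic0/S00_AXI]
connect_bd_intf_net [get_bd_intf_pins ic0/M00_AXI]   [get_bd_intf_pins gpio0/S_AXI]

assign_bd_address
# validate_bd_design will (correctly) complain about the clocks/resets
# left unconnected in this stripped-down sketch.
validate_bd_design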

Theo

Reply to
Theo Markettos

If you think that is fun, just imagine doing it on Windows - with all the TCL and perl running under Cygwin.

It is a long time since I used the Nios, and I only did so very briefly (the project was cancelled for many reasons). But I seem to remember there being a lot of extra building going on due to the interaction between the software and the hardware. On the one side, the software for the Nios was made into a ROM component for the FPGA design, and thus meant at least a partial FPGA rebuild (and incremental builds were only in the expensive version of the tools, not the free ones). On the other side, a build in the FPGA side could mean changes to the automatically generated include files for the peripheral registers and addresses, triggering a software rebuild.

But it certainly /was/ possible to separate software and hardware development. Typically you only have such tight integration for a small part of the software - a boot rom - that sets up memory and loads the real program from external flash. That program can be developed independently. (And again, separate makefiles are more efficient - but the IDE with plugins can make debugging nicer.)

It is also worth noting that Eclipse has got far better since the early days of the NIOS2. It used to be a serious memory and processor hog, with few features to justify the weight. These days it still takes a fair chunk of memory, but it is a far lower fraction of the typical workstation. (My main Linux system generally has at least three or four distinct instances of Eclipse running at any one time, in different workspaces, for different projects.) And I find it to be the best choice for bigger projects in C, C++, and Python - as well as convenient for LaTeX and other coding. (But with external makefiles!).

Reply to
David Brown

I actually like the graphical interface for putting complex top-level blocks together (at least until VHDL-2018 comes out with interfaces), and you can make it write bad but sufficient Tcl that you can lock down for CI.
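(The lock-down step itself is just export-and-resource; paths below are placeholders:)

# One-off, from the Vivado Tcl console: dump the block design to a Tcl
# recreation script -- the "bad but sufficient" Tcl -- and check it in.
write_bd_tcl -force scripts/system_bd.tcl

# In the scripted/CI build, rebuild the design from that script and
# generate its HDL wrapper:
source scripts/system_bd.tcl
make_wrapper -files [get_files system.bd] -top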

But have you run into the fact yet that, while the synthesis engine supports VHDL-2008, IP Integrator doesn't? You can't even write a thin wrapper; any VHDL-2008 anywhere in your design poisons the whole thing such that IPI can't work with it.

--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com 
Email address domain is currently out of order.  See above to fix.
Reply to
Rob Gaddi

We have a fairly straightforward build process (makefiles, and TCL) for our Xilinx FPGAs using non-project mode TCL. We do nightly builds on all our FPGAs - the current build list is ~40 FPGAs.
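(For anyone who hasn't seen it, the skeleton of a non-project run is pleasingly small - the part number and file names below are invented:)

# build.tcl -- minimal non-project-mode sketch, run as:
#   vivado -mode batch -source build.tcl
# The part and paths are placeholders.
read_vhdl  [glob src/vhdl/*.vhd]
read_xdc   constrs/top.xdc

synth_design -top top -part xc7z020clg484-1
opt_design
place_design
route_design

report_timing_summary -file output/timing.rpt
write_bitstream -force output/top.bit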

For Xilinx IP we struggle up front to use the F#@#$F IP integrator or other GUIs to generate an example project. Then we reverse engineer the RTL that's usually under the covers, and use that directly. Everything else is thrown out. We've typed up a "Just the RTL" document which we've given to Xilinx to explain why we do this.

After the time spent on the up-front reverse engineering, things work fine - never having to open the darned Xilinx IDE again.

This thread makes me nod my head (in a misery-loves-company sort of way) in that I see you software folks are basically doing the same thing.

The absolute WORST part of the Xilinx flows is in their MPSoC designs and configuring boot-loaders and rootfs images. Here one must use their awful '80s-style schematic capture code to configure the bootloader and initial images. Yes, schematic capture to design software.

They even crypto-sign the intermediate files (HDF) to prevent engineers from trying to create a more sane flow. Absolute insanity...

Regards,

Mark

Reply to
gtwrek

That aspect is useful; however, the idea that inputs go on the left and outputs on the right is a braindead hangover from analogue schematics. Typically a module has several interfaces - e.g. a bridge has an AXI slave, its clock and reset, and an AXI master with its clock and reset. That's two groups of each of AXI/clock/reset. So why put the clock/reset inputs on the left and the associated AXI master on the right? Why are there always wires crossing from one side of the component to the other?

It's fine on a small design, but it's a nightmare on a complicated design. My current Intel Qsys design has about 40 components (mostly a lot of bridges of various kinds) in 4 levels of hierarchy, which would be a complete mess to represent in an inputs=left, outputs=right fashion.

I'm mostly composing generated IP, so I avoid at least this problem...

Theo

Reply to
Theo Markettos

It wasn't just any Eclipse, it was a fork of Eclipse from 2005. Eclipse itself got a lot better, Altera's didn't.

I inherited a teaching lab which used Altera Eclipse on NIOS2, but I found I always had to revert to the command line to work out what was actually going on. When I rewrote the lab (and we moved away from NIOS to RISC-V), I junked the IDE and went with terminals and Makefile-based development - on the basis that it's something students should be exposed to at some point in their careers, and it makes debugging their code a lot more sane from our point of view. They still drive Quartus via the GUI (because students start out not knowing what an FPGA is, and it's easier for them to understand what's happening via the GUI), but Modelsim they mostly drive through pre-supplied scripts, given Modelsim's non-intuitive GUI.
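(The pre-supplied scripts are nothing clever, for what it's worth - a typical hand-out .do file is along these lines, with file and entity names invented:)

# sim.do -- sketch of a canned ModelSim script, run as: vsim -c -do sim.do
# File and entity names are placeholders.
vlib work
vcom -2008 src/counter.vhd
vcom -2008 tb/tb_counter.vhd
vsim work.tb_counter
add wave -r /*
run -all
quit -f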

Theo

Reply to
Theo Markettos

Yes, manufacturers' IDEs used to be done that way. They'd take a fork of Eclipse and modify it to fit their uses. And that meant you always had an old version of Eclipse, and often one that didn't work with other useful plugins (such as for version control systems). It also often meant that you were stuck on Windows.

These days, they are invariably organised as plugins for standard Eclipse. That means that updates are much more regular - each release of the tools usually builds on a relatively new version of Eclipse.

Reply to
David Brown
