Pi Hardware

I am an experienced programmer and electronics designer. Dealt with many different Intel controller hardware items as well as CP/M and Windows OS. Programmed in PL/M (Intel's language) for controllers.

But now interested in Pi. So recommendations for advanced hardware appreciated. Not just the Pi, but other peripherals and hardware (cases etc). Will probably build many different kinds of projects.

No baby steps please. Advanced peripherals and hardware please.

Retirement is fun time! (Or else!)

Reply to
Aioli

On Wed, 9 Sep 2020 12:01:57 -0700, Aioli declaimed the following:

Problem is: any reply to your "advanced hardware" and "peripherals" is dependent upon WHAT you are building.

After all -- for someone wanting to replace a Windows desktop... An R-Pi 4B 8GB, an HDMI monitor, USB keyboard & mouse (I'd recommend saving a USB port by using something like a Logitech "Unifying" wireless keyboard and mouse), and a USB(3) disk drive (on which one mounts /home, /tmp, /var and a swap file -- all the stuff that gets lots of changes that would wear out a uSD card) qualifies, with some box to put the drive and R-Pi into...
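For the "mount the churny filesystems on the USB drive" part, an /etc/fstab sketch might look like this (the device names, partition layout and swap-file path here are assumptions; match them to how you actually partition the drive):

```text
# Assumed: USB disk appears as /dev/sda with three ext4 partitions
/dev/sda1   /home   ext4   defaults,noatime   0   2
/dev/sda2   /var    ext4   defaults,noatime   0   2
/dev/sda3   /tmp    ext4   defaults,noatime   0   2
# Swap file kept on the USB drive rather than the uSD card
/home/swapfile   none   swap   sw   0   0
```

The point is simply that everything write-heavy lives on the spinning (or SSD) USB drive, so the uSD card only sees the mostly-read-only OS.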

For someone trying to build a programmable Christmas tree lighting set, a couple of long strings of NeoPixel LEDs, a big power supply (the LEDs want power), almost any R-Pi with WiFi, and a case to hold the R-Pi and power supply -- and an ability to code a simple web server application for controlling the LED sequencer program. Connect to R-Pi over WiFi to access LED configuration using a browser in a phone or tablet.

If you are doing projects that require measuring analog data (an oscilloscope perhaps) you will need a dedicated multi-port ADC chip (the R-Pi does not have on-board ADC -- unlike the Beaglebone Black). You may also want multi-line PWM chips, as the R-Pi is a bit limited in that aspect too (software PWM is rate-limited and CPU-heavy).

--
	Wulfraed                 Dennis Lee Bieber         AF6VN 
	wlfraed@ix.netcom.com    http://wlfraed.microdiversity.freeddns.org/
Reply to
Dennis Lee Bieber

Perhaps you might consider entering PiWars to give some stimulus for challenges; then how you achieve them is up to you (over-engineering seems to be de rigueur).

formatting link

--
[A computer is] like an Old Testament god, with a lot of rules and no  
mercy. 
		-- Joseph Campbell
Reply to
alister

Well I bought a high quality DAC and built myself a hifi internet radio that also can play any CD I have ripped to my server.

It's controlled by any web browser.

I have a yen to make a weather station as well, one day.

--
“The ultimate result of shielding men from the effects of folly is to
fill the world with fools.”

Herbert Spencer
Reply to
The Natural Philosopher

... and if you want lots of digital and analogue i/o without adding too many bits and pieces, get a BeagleBone Black! :-)

(I have several Pis and BBBs, horses for courses)

--
Chris Green
Reply to
Chris Green

I have a Davis Vantage Vue weather station: a remote sensor unit which measures wind speed/direction, rainfall amount/rate, outside temperature and humidity, and communicates these to a base station by radio link (proprietary, not wifi). The base station has a USB output.

I run Cumulus software (free download) on my Pi to log this data, draw graphs, record the data and any extremes, and upload to a web site. Previously I ran Cumulus on my Windows PC, which meant leaving it on 24/7; I also used that PC for recording TV. Moving those functions to a Pi means the PC can be switched off when I'm not using it.

Some time, I will investigate uploading the data to an SQL database on the server so a PHP program can extract and display graphs of specified parameters over a specified interval of time.

Reply to
NY

I've got an Oregon Scientific weather station which can communicate with an Android app via Bluetooth LE. Someone reverse engineered the protocol, so I can use a Python program on a Pi to download temperature, humidity and pressure readings. I combine these with readings from HTU21D and BME280 I2C sensors on other Pis which are around the house and shed, and log to an SQLite database.
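A minimal sketch of that logging pattern: the actual I2C read is stubbed out (read_sensor below is a placeholder, not a real HTU21D/BME280 driver call), and the table layout is my assumption, not druck's actual schema.

```python
import sqlite3
import time

def read_sensor():
    # Placeholder for a real HTU21D/BME280 I2C read: returns
    # (temperature C, humidity %, pressure hPa) as floats.
    return 21.4, 48.0, 1012.6

def log_reading(db_path="weather.db", location="shed"):
    # Append one timestamped reading to a local SQLite database.
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS readings (
                        ts INTEGER, location TEXT,
                        temperature REAL, humidity REAL, pressure REAL)""")
    t, h, p = read_sensor()
    conn.execute("INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
                 (int(time.time()), location, t, h, p))
    conn.commit()
    conn.close()
```

Run from cron every few minutes, this gives you the raw table that the web front end later queries.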

I've got another Pi running nginx web server with uwsgi, so it can use a Python program (rather than PHP) to retrieve data from the SQL database and generate HTML+javascript to plot it using Google charts.

It wasn't too difficult to do. I can't really formally release it as open source, as it uses bits of stuff pulled from various other projects, but I can give you some pointers if you like.

---druck

Reply to
druck

A nuclear power station?

formatting link

Theo

Reply to
Theo

Yes, the only bit I haven't cracked is running a process on the Pi that uploads data to the web server (maybe in daily batches, maybe every 10 minutes when a new entry is added to the local log file on the Pi) so as to add it to the SQL table. Doing it manually is easy enough from cPanel accessed by web page (LOAD DATA INFILE from a file on the Pi or Windows PC). I've done it for a big database of WWI soldiers' details that my parents gradually add to as they do research: I save an Excel spreadsheet as a CSV file, wipe the SQL table of its existing data, and then upload the whole lot again (including additions and changes). It would be nice to find a way of automating this, preferably so it only uploads data that the server doesn't already have.

Extracting the data - selected fields between selected dates - and passing it to Google Charts is easy enough: I can use a variant of the PHP code that my parents' website uses to extract data, and I've already worked out a specimen Google Chart, though I wonder where I put that code...

It's so we can see graphs over a longer period of time than the last 48 hours that I've configured into Cumulus.

The hosting is on GoDaddy (which I always want to call Big Daddy!).

Reply to
NY

On Thu, 10 Sep 2020 10:33:01 +0100, Chris Green declaimed the following:

I think I'm up to 6 R-Pi (one running Pi-Star, another my HTTP server -- 2@3B, 2@3B+, 4B 2GB, 4B 4GB), only 2 BBB, and a BB AI (not recommended for experimenting unless one is comfortable with editing/building device tree files -- the BBB has run-time pin-muxing, the BB AI does not).

But I've also got 6 TIVA-C boards, 4 or 5 Arduino, 2 Metro, some ancient BASIC Stamps, and a few Propeller boards.

If you need lots of hardware timers -- the TIVA TM4C123 must be a winner. Six 64-bit and six 32-bit timers -- and each of those can be split into two half-width timers. ARM M4F core, 12-bit ADC. No OS overhead.

In contrast, the Propeller has 8 simple cores, running in lockstep, and NO INTERRUPTS (the idea was that one would dedicate a core just to polling whatever would create an interrupt).

--
	Wulfraed                 Dennis Lee Bieber         AF6VN 
	wlfraed@ix.netcom.com    http://wlfraed.microdiversity.freeddns.org/
Reply to
Dennis Lee Bieber

That sounds like a case for using a cron job. See "man cron" for how to manage crond, the cron daemon, and "man 5 crontab" for how to write a cron job script.

crond is a daemon process (i.e. started on boot and waits for stuff to do) that looks for shell scripts to execute, runs any that it finds, and e-mails results and/or errors to the user who submitted the job.

Cron jobs can be run once every hour, day, week or month by putting a script in the appropriate directory, e.g. scripts in /etc/cron.daily are run once a day, typically somewhat after midnight. In these cases the script contains details of which user it should be run under and where to send any data written to stdout or stderr, with the default being to run it under root and to send output to root.

Jobs with more complex or frequent run timing requirements are put in /etc/cron.d and contain one or more lines, each defining what to run and when. Here's one of mine:

=====================================================================
#
# getmail is run every 10 minutes
# ===============================
# Results are sent to root
#
SHELL=/bin/bash
MAILTO=root

1,11,21,31,41,51 0-23 * * * root /usr/local/bin/getmail.cronscript

=====================================================================

This uses /bin/bash to run the /usr/local/bin/getmail.cronscript shell script under the root user and e-mails any output, which would be error reports, to 'root'. It is run every 10 minutes, at 1, 11, 21, 31, 41 and 51 minutes after the hour.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

Yes, I'd already identified that cron would be a good way of triggering the upload. It's a matter of working out what can be run (via cron) on the Pi to make a remote web server add data to an SQL database held on its server, preferably checking each row of data that is about to be added to verify the server doesn't already have it. That would make the process resilient to temporary outages, which would otherwise cause it to miss data if there were no catch-up mechanism.
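One common way to get that "only add rows the server doesn't already have" behaviour is a unique key on the timestamp plus an insert that skips duplicates. A sketch of the idea, using SQLite as a stand-in database (MySQL spells the same construct INSERT IGNORE, or INSERT ... ON DUPLICATE KEY UPDATE); the table and column names are invented for illustration:

```python
import sqlite3

def upload_rows(conn, rows):
    # Insert (ts, temperature) rows; any row whose timestamp is already
    # present is silently skipped, so re-sending an overlapping batch
    # after an outage is harmless.
    conn.execute("""CREATE TABLE IF NOT EXISTS weather (
                        ts INTEGER PRIMARY KEY, temperature REAL)""")
    conn.executemany("INSERT OR IGNORE INTO weather VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
upload_rows(conn, [(1000, 20.1), (1010, 20.3)])
upload_rows(conn, [(1010, 20.3), (1020, 20.5)])  # overlap: 1010 resent
```

Because duplicates are rejected by the database itself, the Pi-side script can always resend a generous window of recent data without any bookkeeping about what got through last time.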

It's on the "round tuit" list - a nice little refinement to what I can do locally at present by searching the local CSV files that Cumulus creates.

I'll probably write the query-and-display software first, manually uploading each month's data into the SQL table, and leave the automatic cron-driven uploading with resilience until later.

Reply to
NY

What, if any, control will you have over the format of data in the email?

I did something similar recently, and ended up using a web page and a bit of PHP to capture data from the user, which made parsing the message body trivial. Then I modified my mail system so that all mail sent by the web page ends up in a dedicated user. It's picked up from that user's mail queue by an overnight cronjob running a chunk of Java that reads the messages (using the javax.mail classes) and processes them. This doesn't use a database, but I know that the JDBC classes work well with PostgreSQL and, of course, there's also the Derby DBMS, which interfaces directly with Java applications because it's written in Java.

If you're unfamiliar with awk, that may be worth a look: it's a scripting language designed for analysing and extracting information from plain text.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

Ah, so the database is on the web server. You will have to find out what the database engine is (e.g. MySQL), the IP address, and the username and password to log on to it. You can then use a suitable Python package such as mysql-connector-python, which can use that information to get a cursor on to the db. It's then a case of issuing SQL commands, in the same way as for a local SQL db.

---druck

Reply to
druck

On Thu, 10 Sep 2020 20:38:41 +0100, "NY" declaimed the following:

Before worrying about the cron job, you need to determine what capabilities the remote site provides.

If you have direct access to the database server itself (with an account/password for the DBMS and suitable privileges for add/delete/update of records) then you can run a script issuing SQL DML statements meaning all the checking for duplicates, et al can be done at your end.

BTW "SQL database" is meaningless. SQL is a Query Language, not a database engine. "SQLite" OTOH is a database engine that uses SQL as the query language -- but practically any relational model database engine uses SQL as the query language.

If, OTOH, all you have access to is a web server interface, you will be limited to submitting form data and receiving/parsing returned web pages.

--
	Wulfraed                 Dennis Lee Bieber         AF6VN 
	wlfraed@ix.netcom.com    http://wlfraed.microdiversity.freeddns.org/
Reply to
Dennis Lee Bieber

The normal solution for simple webhosting with no outside-facing database is to make a PHP script on the web server that receives your data, maybe via a form upload, maybe directly as a query parameter, then processes the data and connects to the database to save it. AKA build your own API. Make sure you add some sort of login and/or secret key and/or checksum. Also make sure that you sanitize your input before throwing it to the db, even if it is just you!

Reply to
A. Dumas

OK, the easy start way for this kind of thing is to create a script that runs under cron on the Pi.

That will 'push' the data to the web server.

The web server side is not too hard - simply create a page that takes form variables so e.g. you can call it with:

http://weather.mysite.com/upload.php?temp=37.2&pressure=987.9 ... and so on

To upload simply use curl in the cron script. How you read the data and get it to curl is your problem!
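As a sketch of the "read the data and get it to curl" half, in Python: this takes the newest line of a CSV log and builds the upload URL. The CSV column order and the hostname are invented for illustration; Cumulus's actual log layout differs.

```python
import csv
from urllib.parse import urlencode

def build_upload_url(csv_path,
                     base="http://weather.mysite.com/upload.php"):
    # Read the last line of the log file and turn it into the
    # query string the upload page expects.
    with open(csv_path, newline="") as f:
        last = list(csv.reader(f))[-1]
    ts, temp, pressure = last[0], last[1], last[2]
    return base + "?" + urlencode(
        {"ts": ts, "temp": temp, "pressure": pressure})
```

The resulting URL can be handed to curl from the cron script, or fetched directly with urllib.request and the curl step skipped entirely.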

formatting link

--
“it should be clear by now to everyone that activist environmentalism
(or environmental activism) is becoming a general ideology about humans,
about their freedom, about the relationship between the individual and
the state, and about the manipulation of people under the guise of a
'noble' idea. It is not an honest pursuit of 'sustainable development,'
a matter of elementary environmental protection, or a search for
rational mechanisms designed to achieve a healthy environment. Yet
things do occur that make you shake your head and remind yourself that
you live neither in Joseph Stalin's Communist era, nor in the Orwellian
utopia of 1984.”

Vaclav Klaus
Reply to
The Natural Philosopher

As I said, 'curl' will send data to the server, as long as you can get data to the command line of curl...

Server-side PHP to access an SQL database is tedious, not hard.

Seriously the cron driven stuff is trivial.

And a PHP script to load the data into a database is also pretty damned trivial.

Takes a lot longer to design a pretty UI with knobs on..

--
You can get much farther with a kind word and a gun than you can with a  
kind word alone. 

Al Capone
Reply to
The Natural Philosopher

Amen to all of that.

Here is the core code for updating a very simple MySQL database from form variables in PHP:

$fields = array("pressure", "temperature", ....);  // this defines all the variables you will send as form variables
$query = "insert into data set";
$flag = 0;
foreach ($fields as $name)  // read variables and add to query
{
    if ($flag)
        $query .= ',';
    if (isset($_GET[$name]))
        $query .= sprintf(" %s='%s'", $name, $_GET[$name]);
    else
        $query .= sprintf(" %s='%s'", $name, "");
    $flag++;
}
// echo $query;
$link = mysqli_connect("localhost", "myuser", "mypass", "weather");
if ($link)
{
    mysqli_query($link, $query);
    mysqli_close($link);
}

And as for cron, you need to use:

curl "myweather.mydomain.com/upload.php?pressure=1005&temperature=37.5&....."

It really is that simple. Of course there is no data validation done and not much security there, but you can add that later on once it actually works. (The URL is quoted so the shell doesn't treat the & characters as background operators.)

--
The biggest threat to humanity comes from socialism, which has utterly  
diverted our attention away from what really matters to our existential  
survival, to indulging in navel gazing and faux moral investigations  
into what the world ought to be, whilst we fail utterly to deal with  
what it actually is.
Reply to
The Natural Philosopher

Unlikely to be allowed to connect to DB directly - have to use a web interface.

Heck I run my own servers and I NEVER expose the mysql server port to the internet.

Only absolutely specific web pages that update very specific things in a very defined and validated way.

SQL programming done on the server by a web script custom written for the job.

--
"When a true genius appears in the world, you may know him by this sign,  
that the dunces are all in confederacy against him." 

Jonathan Swift.
Reply to
The Natural Philosopher

ElectronDepot website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.