Packt Publishing is currently running a free e-books offer, with each book available for one day. So if you’re a developer or thinking about becoming one, it’s worth checking out every day until they discontinue it. (Last year, they did the same thing, but ran it for a much shorter period of time.)
I’m pretty sure that the free offers will not include their latest or best-selling titles – duh! – but it’s still a very good offer. Packt does tend to concentrate on open source and open standards.
Packt does not offer books for hard-core industrial automation programmers (for example, no PLC books), but some of the books that were already offered for free could be useful for factory software projects that use mobiles (June 17, Creating Dynamic UI with Android Fragments), web standards (June 22, HTML5 Data and Services Cookbook), databases (June 18, RavenDB 2.x Beginner’s Guide), or development tools (June 19, Learning Gerrit Code Review).
Thanks to Packt, I’ve discovered some potentially useful projects (such as Learning Gerrit Code Review) and have been able to snag free books on topics I’ve been interested in but haven’t used yet (such as Instant R Starter). I’ve also bought some books from Packt, such as Learning BeagleBone.
June 22, 2015 1 Comment
I’m thinking about rewriting an application, and have been looking at adding in a little bit of parallelism. I suspect it’s like many automation applications: there isn’t any potential for massive speed increases from using a multi-core processor, since most of the time is spent loading/unloading the part and in sequenced motion. The data handling is so quick on a modern CPU that there’s no point trying to speed it up; instead, I’m looking at doing, say, network access in parallel with motion.
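As a sketch of that kind of overlap (in Python rather than .NET, with made-up run_motion_sequence and upload_results stand-ins for the real work), the idea is simply to start the network call before the motion sequence, then collect both results:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def run_motion_sequence(part_id):
    # Stand-in for the sequenced motion work, the long unavoidable step.
    time.sleep(0.05)
    return f"motion done for part {part_id}"

def upload_results(part_id):
    # Stand-in for network access, e.g. posting results for the previous part.
    time.sleep(0.05)
    return f"upload done for part {part_id}"

def process_part(part_id):
    # Start the upload for the previous part, run motion concurrently,
    # then wait for both; the network time hides behind the motion time.
    with ThreadPoolExecutor(max_workers=2) as pool:
        upload = pool.submit(upload_results, part_id - 1)
        motion = pool.submit(run_motion_sequence, part_id)
        return motion.result(), upload.result()
```

With the two 50 ms stand-ins above, each call to process_part takes roughly 50 ms instead of 100 ms, which is the whole payoff: no massive speedup, just hiding one latency behind another.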
There are a wide variety of approaches to parallel programming, including:
- Multiple processes, which is very heavyweight.
- Traditional threading. Most programmers find it very hard to write bug-free multi-threaded code.
- Asynchronous calls, which have limited scalability but can still add considerable complexity.
- Actor model, used in Erlang and Scala.
- Software Transactional Memory, used by Clojure and Haskell.
- Agents, used by Clojure.
- Dataflow variables, used by the Oz programming language.
- Dataflow programming, used by LabVIEW.
- Microsoft’s Task Parallel Library supports various techniques including parallel For/Foreach loops, parallel invoke, parallel LINQ, and actors.
- And I’m sure there are many more…
Andy Glover and Alex Miller discuss many of these approaches during an information-packed IBM developerWorks podcast. I’m certainly no expert, but I strongly believe that there won’t be one dominant approach to parallel programming, and I don’t think parallel programming will ever be easy. Just creating a good (meaning maintainable, extensible, testable, and reliable) single-threaded program isn’t easy; adding parallelism adds another layer of complexity. A naive parallel program can actually take longer to run than a single-threaded one.
There are also a variety of goals: do you want parallelism to speed up massive calculations (such as simulations), to scale to a massive number of users (such as web applications), or for extremely high reliability (such as telecom switches)? I highly doubt these divergent goals will have the same solution; for example, GPUs can be great for speeding up simulations, but won’t help with telecom reliability.
So that’s why I get skeptical when companies promote their approach as “painless parallel programming” with wonderful speedup. Sure, you might get that promised speedup by replacing a 2-core CPU with a 12-core CPU, but only if your problem and your approach to that problem are well suited to that tool’s approach.
For my problem, I have various constraints: .NET is highly preferred, others need to be able to maintain the code (so no F#), and I value simplicity over performance. I’m looking at either traditional threading, the Microsoft TPL, or an actor/message-passing approach.
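As a rough sketch of the message-passing style (again in Python; the run_actor helper and its messages are invented for illustration), the key idea is that each actor owns its state, and other threads interact with it only by posting messages to its queue, so no locks are needed:

```python
import queue
import threading

def run_actor(messages):
    """Feed messages to a single actor thread and return how many it handled."""
    inbox, results = queue.Queue(), queue.Queue()

    def actor():
        # The actor's state (count) is touched by this one thread only;
        # all coordination happens through the queues, not shared memory.
        count = 0
        while True:
            msg = inbox.get()
            if msg is None:        # poison pill: shut down and report
                results.put(count)
                return
            count += 1

    t = threading.Thread(target=actor)
    t.start()
    for m in messages:
        inbox.put(m)               # fire-and-forget message passing
    inbox.put(None)
    t.join()
    return results.get()
```

Here run_actor(["A", "B", "C"]) returns 3. The same shape carries over to .NET, where the queues would become something like a BlockingCollection or a TPL Dataflow block.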
As a side note, theoretically PLCs should easily handle parallel programming, since they’re based on combinatorial logic. Just create a PLC-to-FPGA compiler that translates the entire PLC program to gates in the FPGA, and run your PLC program simultaneously, without a scan sequence, at MHz clock rates! The problem, of course, is that most PLC programs rely on the order within the PLC scan sequence, and many advanced PLC functions don’t easily translate to FPGA logic.
July 10, 2012 No Comments
“A Bad Technician Is Worth Negative Money” is something I said a lot back in the days when I had to go around and fix all the stuff the night shift technician had screwed up. A technician who causes problems is worth negative money because not only does he not do his job, he sucks up the time of others who must fix his mistakes.
Larry O’Brien comes to a similar conclusion about software developers: bad programmers are not slow programmers – they are programmers who are actively counterproductive to the code base. In a fascinating post, he argues that the goal isn’t a silver bullet for programmer productivity, but a silver codebase, which bad programmers make impossible. Larry started all this discussion by dissecting the myth of the super-programmer.
My take – he makes sense to me. I’ve had to clean up code from some, well, people who shouldn’t have been programming, and it was not pretty. I’ve seen how a well designed codebase can make adding functionality much easier. On the other hand, I currently have an inherited codebase that needs some serious refactoring before it’s anything close to silver.
February 1, 2008 2 Comments
I was going through an old stack of JDJs (Java Developer Journals), and saw an interesting commentary on Test-Driven Development (TDD). The author said its purpose was to help you write the minimal amount of code necessary to meet the current requirements, and that having a test suite available was a nice side effect. His TDD development sequence is:
- Write tests that show your requirements, based on the perfect client interface
- Write just enough code to compile successfully
- Write just enough so that all the tests pass
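Here is roughly what that sequence might look like in Python with the built-in unittest module, for a made-up requirement (clamping a setpoint into an allowed range): the tests come first, against the interface you wish you had, and then just enough code is written to make them pass.

```python
import unittest

# Step 1: write tests that show the requirements, against the ideal interface.
class TestClampSetpoint(unittest.TestCase):
    def test_within_range_passes_through(self):
        self.assertEqual(clamp_setpoint(50, low=0, high=100), 50)

    def test_out_of_range_is_clamped(self):
        self.assertEqual(clamp_setpoint(150, low=0, high=100), 100)
        self.assertEqual(clamp_setpoint(-5, low=0, high=100), 0)

# Steps 2 and 3: write just enough code to compile, then to make the tests pass.
def clamp_setpoint(value, low, high):
    return max(low, min(high, value))

# Run the suite programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestClampSetpoint)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The minimal clamp_setpoint above is the point of the exercise: nothing beyond what the tests demand gets written.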
Now apply this idea to the world of factory automation software. Hmm, could be interesting, but it wouldn’t work with PLCs (rllUnit, anyone? I don’t want to even imagine porting xUnit to IEC 61131.)
July 21, 2007 No Comments
Here is my quick overview of modern software development trends. It’s somewhat quirky, but software development is a big area, all the way from programming $0.50 PIC micros in toaster ovens to massive web services and supercomputers.
Where do I pick up these trends? From trade magazines (after applying my hype discount factor), web blogs, discussions with other programmers, and what works for me.
- Version Control and Bug Tracking that work. These are essential for good development, but I suspect that most automation developers, at least at small companies, still do not use them.
- Design Patterns – a good way of talking about software architecture.
- Unit testing and Test-Driven Development – somewhat hard to apply to automation software, but definitely making waves right now in desktop and web development.
- Refactoring – restructuring and improving existing software, instead of doing a ground-up re-write.
- Agile Development – iterative, customer focused development instead of BDUF (Big Design Up Front).
- Dynamic languages – allow rapid development, better re-use of existing components, and run-time customization.
- Mixed language development – use the appropriate tool for the job. A desktop example is Adobe Lightroom, written in Lua and C. I do a lot of this (especially when including IEC 61131 as a language).
- Web technologies – interesting stuff, but I think factory software developers should not get caught up in the hype.
- Open Source – some very good products to add to your toolkit.
I plan on discussing many of these techniques in the future, using concrete examples of the tools I use and like. For example, I will use Subversion when discussing version control.
Comments 4/5/2011: still good stuff, but unfortunately I haven’t followed up (limited by lack of time and my fun excursions into PCB and MCAD land). I need to write more about these topics.
June 21, 2007 No Comments
Open source software is a great resource. If you’re going to use and modify open source code in your machine, then you have to look at licensing issues very carefully. But there is another way to take advantage of open source software – by using open source projects such as Subversion to enhance your development process with no licensing worries.
Even if the majority of open source software can safely be ignored, there are many rock-solid, high-quality, well-documented, Windows-friendly projects. Examples include:
- Subversion for version control with TortoiseSVN client for Windows
- Trac for project management and bug tracking
- Apache web server
- xUnit for unit testing
- WiX for MSI installers (OK, its documentation is a little weak, but it’s still what I use to create Windows installers)
- Programming languages such as Python, IronPython on .NET, and Lua.
What are some of the advantages of these top notch projects?
- High quality. For example, Subversion has a better reputation than Microsoft’s SourceSafe.
- Low cost, both in money (nothing) and time (due to good documentation).
- Ease of deployment – with Subversion, I don’t have to worry about how many users there are. Adding another user costs exactly $0.00, unlike adding a user on a commercial version control system.
- No licensing hassles; there are no hardware keys, no software keys, and no chance of any software license audits.
- More stable path – no forced upgrades (upgrade when you’re ready), no worries about the vendor going out of business, no vendor agendas to worry about.
- Often commercial help is available if desired – so far I’ve done fine with the documentation and Google.
June 14, 2007 No Comments