EmbeddedRelated.com

Signal Processing Contest in Python (PREVIEW): The Worst Encoder in the World

Jason Sachs · September 7, 2013 · 6 comments

When I posted an article on estimating velocity from a position encoder, I got a number of responses. A few of them were of the form "Well, it's an interesting article, but at slow speeds why can't you just take the time between the encoder edges, and then...." My point was that there are lots of people out there who take this approach and don't take into account that the time between encoder edges varies due to manufacturing errors in the encoder. For some reason this is a hard concept...
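As a rough, hypothetical illustration of that point (not from the article itself), here is a small Python sketch with made-up numbers showing how a naive "divide nominal spacing by edge-to-edge time" estimate picks up noise when the edge spacing isn't perfectly uniform:

```python
import numpy as np

# Hypothetical example: a motor spins at a constant 10 rev/s with a
# 64-count/rev encoder, but manufacturing error perturbs each edge
# position by up to +/-10% of the nominal spacing.
rng = np.random.default_rng(1)
counts_per_rev = 64
true_speed = 10.0                         # rev/s (assumed constant)
nominal_spacing = 1.0 / counts_per_rev    # rev between edges, nominally
actual_spacing = nominal_spacing * (1 + 0.1 * rng.uniform(-1, 1, 200))
edge_times = np.cumsum(actual_spacing) / true_speed   # when edges occur

# Naive estimator: assume edges are exactly 1/64 rev apart, divide by dt
dt = np.diff(edge_times)
naive_speed = nominal_spacing / dt
print(naive_speed.mean(), naive_speed.std())  # centered near 10, but noisy
```

Even with the motor turning at a perfectly constant speed, the estimate jitters by roughly the same percentage as the edge-placement error.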


Lost Secrets of the H-Bridge, Part III: Practical Issues of Inductor and Capacitor Ripple Current

Jason Sachs · August 24, 2013 · 3 comments

We've been analyzing the ripple current in an H-bridge, both in an inductive load and the DC link capacitor. Here's a really quick recap; if you want to get into more details, go back and read part I and part II until you've got equations coming out of your ears. I promise there will be a lot less grungy math in this post. So let's get most of it out of the way:

Switches QAH and QAL are being turned on and off with pulse-width modulation (PWM), to produce an average voltage \( D_A V_{dc} \) on...
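A toy numerical check of that averaging idea, with hypothetical values for the DC link voltage, duty cycle, and switching frequency (none of these come from the article):

```python
import numpy as np

Vdc = 24.0     # DC link voltage, volts (hypothetical)
D_A = 0.75     # duty cycle of the A half-bridge (hypothetical)
f_pwm = 20e3   # switching frequency, Hz (hypothetical)
T = 1.0 / f_pwm

# Node A is connected to Vdc while QAH is on and to 0 while QAL is on,
# so over one PWM period its average is D_A * Vdc.
t = np.linspace(0, T, 1000, endpoint=False)
v_A = np.where(t < D_A * T, Vdc, 0.0)
print(v_A.mean(), D_A * Vdc)   # both 18.0 V
```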


Lost Secrets of the H-Bridge, Part II: Ripple Current in the DC Link Capacitor

Jason Sachs July 28, 2013

In my last post, I talked about ripple current in inductive loads.

One of the assumptions we made was that the DC link was, in fact, a DC voltage source. In reality that's an approximation; no DC voltage source is perfect, and current flow will alter the DC link voltage. To analyze this, we need to go back and look at how much current actually is being drawn from the DC link. Below is an example. This is the same kind of graph as last time, except we added two...


Lost Secrets of the H-Bridge, Part I: Ripple Current in Inductive Loads

Jason Sachs July 8, 2013

So you think you know about H-bridges? They're something I mentioned in my last post about signal processing with Python.

Here we have a typical H-bridge with an inductive load. (Mmmmm ahhh! It's good to draw by hand every once in a while!) There are four power switches: QAH and QAL connecting node A to the DC link, and QBH and QBL connecting node B to the DC link. The load is connected between nodes A and B, and here is represented by an inductive load in series with something else. We...


Adventures in Signal Processing with Python

Jason Sachs · June 23, 2013 · 11 comments

Author’s note: This article was originally called Adventures in Signal Processing with Python (MATLAB? We don’t need no stinkin' MATLAB!) — the allusion to The Treasure of the Sierra Madre has been removed, in deference to being a good neighbor to The MathWorks. While I don’t make it a secret of my dislike of many aspects of MATLAB — which I mention later in this article — I do hope they can improve their software and reduce the price. Please note this...


How to Estimate Encoder Velocity Without Making Stupid Mistakes: Part I

Jason Sachs · December 27, 2012 · 30 comments

Here's a common problem: you have a quadrature encoder to measure the angular position of a motor, and you want to know both the position and the velocity. How do you do it? Some people do it poorly -- this article is how not to be one of them.

Well, first we need to get position. Quadrature encoders are incremental encoders, meaning they can only measure relative changes in position. They produce a pair of pulse trains, commonly called A and B, that look like...
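As a minimal sketch of the decoding step the excerpt describes (not the article's own code), here is one way to turn sampled A/B bits into a position count in Python; the direction convention depends on which channel leads:

```python
# The (A, B) pair forms a 2-bit Gray code; each valid transition moves
# the count by +/-1, and a two-bit jump indicates a missed state.
# state = (A << 1) | B, stepping 0 -> 1 -> 3 -> 2 -> 0 in one direction.
_FORWARD = {(0, 1), (1, 3), (3, 2), (2, 0)}
_REVERSE = {(1, 0), (3, 1), (2, 3), (0, 2)}

def decode(a_bits, b_bits):
    """Return cumulative position counts from sampled A/B bit sequences."""
    position = 0
    counts = []
    prev = (a_bits[0] << 1) | b_bits[0]
    for a, b in zip(a_bits, b_bits):
        state = (a << 1) | b
        if (prev, state) in _FORWARD:
            position += 1
        elif (prev, state) in _REVERSE:
            position -= 1
        # prev == state: no motion; any other jump is an illegal transition
        prev = state
        counts.append(position)
    return counts
```

In practice this lookup lives in hardware (a timer's quadrature-decode mode or a small FPGA block), but the state machine is the same.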


Linear Feedback Shift Registers for the Uninitiated, Part XIII: System Identification

Jason Sachs · March 12, 2018 · 1 comment

Last time we looked at spread-spectrum techniques using the output bit sequence of an LFSR as a pseudorandom bit sequence (PRBS). The main benefit we explored was increasing signal-to-noise ratio (SNR) relative to other disturbance signals in a communication system.

This time we’re going to use a PRBS from LFSR output to do something completely different: system identification. We’ll show two different methods of active system identification, one using sine waves and the other...
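For readers who just want to see a PRBS come out of an LFSR, here is a short Python sketch of the common PRBS7 generator (polynomial \( x^7 + x^6 + 1 \), period 127); it is a generic illustration, not the article's implementation:

```python
def prbs7(n_bits, seed=0x7F):
    """Maximal-length PRBS from a 7-stage Fibonacci LFSR
    (polynomial x^7 + x^6 + 1, period 2**7 - 1 = 127 bits)."""
    state = seed & 0x7F
    out = []
    for _ in range(n_bits):
        newbit = ((state >> 6) ^ (state >> 5)) & 1   # XOR of stages 7 and 6
        state = ((state << 1) | newbit) & 0x7F
        out.append(newbit)
    return out

# Map 0/1 bits to a +/-1 excitation signal, the usual form for
# system-identification experiments.
excitation = [2 * b - 1 for b in prbs7(127)]
```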


Return of the Delta-Sigma Modulators, Part 1: Modulation

Jason Sachs May 28, 2023

About a decade ago, I wrote two articles:

Each of these is about delta-sigma modulation, but they’re short and sweet, and not very in-depth. And the 2013 article was really more about analog-to-digital converters. So we’re going to revisit the subject, this time with a lot more technical depth — in fact, I’ve had to split this...
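For a flavor of what delta-sigma modulation does, here is a minimal first-order modulator sketch in Python (a generic textbook form, not taken from the article): the 1-bit output stream's running average tracks the input.

```python
import numpy as np

def delta_sigma_1st_order(x):
    """First-order delta-sigma modulator: input samples in [-1, 1],
    output a +/-1 bit stream whose average tracks the input."""
    integrator = 0.0
    y_prev = 0.0
    bits = []
    for xk in x:
        integrator += xk - y_prev            # accumulate quantization error
        y = 1.0 if integrator >= 0 else -1.0  # 1-bit quantizer
        bits.append(y)
        y_prev = y
    return np.array(bits)

bits = delta_sigma_1st_order(np.full(1000, 0.3))
print(bits.mean())   # close to 0.3
```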


Linear Regression with Evenly-Spaced Abscissae

Jason Sachs · May 1, 2018 · 1 comment

What a boring title. I wish I could come up with something snazzier. One word I learned today is studentization, which is just the normalization of errors in a curve-fitting exercise by the sample standard deviation (e.g. point \( x_i \) is \( 0.3\hat{\sigma} \) from the best-fit linear curve, so \( \frac{x_i - \hat{x}_i}{\hat{\sigma}} = 0.3 \)) — Studentize me! would have been nice, but I couldn’t work it into the topic for today. Oh well.
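A quick sketch of that simple form of studentization in Python, using made-up evenly spaced data (the full treatment would also account for the leverage of each point):

```python
import numpy as np

# Hypothetical data: y sampled at evenly spaced abscissae with noise
rng = np.random.default_rng(0)
x = np.arange(20)
y = 0.7 * x + 2.0 + rng.normal(scale=0.5, size=x.size)

# Least-squares linear fit and residuals
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# Normalize each residual by the sample standard deviation of the
# residuals (ddof=2 because two parameters were fitted)
sigma_hat = np.std(residuals, ddof=2)
studentized = residuals / sigma_hat
print(studentized)
```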

I needed a little break from...


A Second Look at Slew Rate Limiters

Jason Sachs January 14, 2022

I recently had to pick a slew rate for a current waveform, and I got this feeling of déjà vu… hadn’t I gone through this effort already? So I looked, and lo and behold, way back in 2014 I wrote an article titled Slew Rate Limiters: Nonlinear and Proud of It! where I explored the effects of two types of slew rate limiters, one feedforward and one feedback, given a particular slew rate \( R \).
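One common form of slew rate limiter (clamping each output step toward the input; this sketch doesn't distinguish the article's feedforward and feedback variants) looks like this in Python:

```python
import numpy as np

def slew_rate_limit(x, R, dt, y0=0.0):
    """Limit the output to change by at most R units/second,
    i.e. at most R*dt per sample, while tracking the input x."""
    x = np.asarray(x, dtype=float)
    y = np.empty_like(x)
    prev = y0
    max_step = R * dt
    for k, xk in enumerate(x):
        prev += np.clip(xk - prev, -max_step, max_step)
        y[k] = prev
    return y
```

Fed a step input, this produces the familiar ramp at rate \( R \) until the output catches up.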

Here was one figure I published at the time:

This...


Debugging DSP code.

Mark Browne May 1, 2019

I am fascinated with neural network processing and have been playing with neural networks since the '80s.

I am a frequent contributor to the Numenta forum. Numenta is the current project of Jeff Hawkins, the guy who gave us the Palm Pilot. They are working with the HTM (Hierarchical Temporal Memory) model. This is a system based on studies of the functions of the cortical column, and it has some very interesting properties: it processes sequential data streams and has very effective one-shot learning. The data is arranged in Sparse...


Unraveling the Enigma: Object Detection in the World of Pixels

Charu Pande February 8, 2024

Exploring the realm of embedded systems co-design for object recognition, this blog navigates the convergence of hardware and software in revolutionizing industries. Delving into real-time image analysis and environmental sensing, the discussion highlights advanced object detection and image segmentation techniques. With insights into Convolutional Neural Networks (CNNs) decoding pixel data and autonomously extracting features, the blog emphasizes their pivotal role in modern computer vision. Practical examples, including digit classification using TensorFlow and Keras on the MNIST dataset, underscore the power of CNNs. Through industry insights and visualization aids, the blog unveils a tapestry of innovation, charting a course towards seamless interaction between intelligent embedded systems and the world.
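A minimal sketch of the kind of MNIST digit classifier mentioned above, assuming TensorFlow/Keras is available; the layer sizes and training settings here are illustrative only, not the blog's exact model:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Load MNIST, add a channel axis, and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# A small CNN: convolution + pooling stages extract features from pixels,
# then dense layers classify the ten digits.
model = models.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
```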