What was your greatest achievement in coding?

Hello, the question is in the thread name.

Warning: I'll shoot without notice anyone saying "C pointers". C pointers are basically trivial, as long as there are no '*' characters in your C codebase 😆

I'll go first: the setting is a context switch in my toy operating system. The issue was a dependency inversion: a lower layer called directly into a higher one. The solution was a callback, and to do this cleanly I added a function-pointer interface, so every layer has a common, fixed interface.

The implementation is in C; I show only the relevant code snippets:

lower layer API header:
C:
typedef hal_stack_word_t *hal_timerSchedCallback_func_t(hal_stack_word_t *stack_pointer);
typedef hal_timerSchedCallback_func_t *hal_timerSchedCallback_ptr_t;

void hal_timerSchedSetCallback(hal_timerSchedCallback_ptr_t func_ptr);

lower layer code:
C:
static hal_timerSchedCallback_ptr_t sched_callback = NULL;

void hal_timerSchedSetCallback(hal_timerSchedCallback_ptr_t func_ptr)
{
    sched_callback = func_ptr;
}

ISR(TIMER1_COMPA_vect, ISR_NAKED)
{
    [...]

    // scheduler callback
    if( sched_callback != NULL ) { sp_next = sched_callback(sp_current); }

    [...]
}

higher layer code:
C:
static hal_timerSchedCallback_func_t tm_schedulerRR;

void tm_schedulerInit(void) { hal_timerSchedSetCallback(tm_schedulerRR); }

[...]

hal_stack_word_t *tm_schedulerRR(hal_stack_word_t *stack_pointer)
{
    mod_thread_item_t *thread;

    // save current thread context
    thread = mod_threadGetPointer(mod_threadGetCurrent());
    thread->stack_pointer = stack_pointer;
    
    // switch thread and context
    [...]

    thread = mod_threadGetPointer(mod_threadGetCurrent());
    return thread->stack_pointer;
}

Please be kind, it's the first time I've implemented something this complicated, and I know there's still a lot of testing to do.

Anyone have something to show?
 
Once upon a time, a million years ago, on a Microvax ...

A VMS host held a region of shared memory with a large binary tree of merchant transactions. Each night transactions were gathered up and transmitted so that merchants could get paid. The design called for multiple processes to map the shared memory. There were jobs to do lookups, posting, reporting, accounting, and the end-of-night settlement. Since the binary tree was full of pointers, each process mapped the shared memory to the same high address, high enough to accommodate the largest program.

The high address required configuring the host OS, VAX/VMS, to allocate a very large VM page table for each process. This slowed down process activation and consumed extra RAM; slow enough that management took notice.

I proposed to rewrite the AVL tree library to use offsets instead of pointers. Think "PIC", but for data. Management signed off, a proof-of-concept was demonstrated, and soon all jobs were updated to use the new library.

We were able to reset the VM page table size back to default. Each process was free to map the shared memory to wherever it pleased. Overall system performance improved. Most important was no down-time or loss of merchant data during the upgrade.
 
Time-relative: When I was 18 I made a fish weighing and sorting program that was used by hundreds of boat personnel. It was in Turbo Pascal. Later I added an RS-232C datalink program to get things onto early Windows, with its totally garbage COM port communication.
 
On an 8-bit Microchip PIC18F4520 µC I needed floating-point operations to realize a PID controller.

I'm talking assembler, of course, which I prefer when timing becomes critical, because with C you need to check the generated machine code anyway: at least count the instructions the compiler produced, distinguishing 1-, 2- and 2-or-3-cycle instructions, or stopwatch some parts of the code, which is a tricky job of its own at the time scale of single machine instructions.
Besides, the C compiler mostly produces more machine code than I do programming directly in assembler. The worst part of using C (a HOL) on a µC is this: if the compiler is updated (a new, "better" version), the machine code produced from your totally unchanged, exactly identical source may differ (without notice, of course), and not seldom gets (a tiny bit) longer. I've had that happen more than once. So I have no trust in C compilers on µCs when timing is critical, as in control loops.

Since there is no floating point on that thing, I defined my own non-IEEE-754-conformant 10-bit format (in 8-bit memory) and, using basically only the processor's add, subtract and complement instructions (the built-in multiplication wasn't good enough for my purpose), I implemented floating-point numbers (10 bits can already be precise enough for many practical things) with the four basic arithmetic operations, add, subtract, multiply and divide, on my own.
Divide was a hard nut to crack, since it was hard to find any useful explanation of how division is done on binary numbers. I easily found hundreds of papers and web pages that elaborate in detail, at length, again and again from the very basics of what binary numbers are, then how addition and subtraction with complements work, spelled out over pages for dummies; sometimes even multiplication was briefly sketched. But when it came to division: end of story. Today you'll also find it on Wikipedia.

However, I not only got practically usable floating point working on that thing; the PID controller (trivial once the first part was solved) also did a really pretty good job.
Many years later I'm still a bit proud of that. 😇
 
On a "speech chip" board with DSPs there was an error, so I created a preprocessing algorithm to fix it.
I did a Z-transform to show this algorithm was stable.
The algorithm was as simple as NEW_VALUE = 0.99*NEW_VALUE + 0.01*OLD_VALUE.

Created a partition manager for FreeBSD using Dlang + GTK, with disk sizes presented on a log scale.
Well, that was until my SSD suddenly died.
 
Coming up with misaligned return-address traps so a profiler would pick up the right return point.
On the 68k, code needs to be at even addresses. So when you profile your code, you patch all the function entries but cannot patch all the exit points to stop the clock for that frame. Here comes the trick: on entry to the function, the trampoline code adds 1 to the return address on the stack. When that address gets jumped to, the CPU immediately takes the misaligned-instruction trap. Subtract 1, and restart. Now you have the exact start/stop for a function and the place the call happened.
 
One of my projects was a 286/386 PC-based satellite ground station, written in assembler, which got installed in several African and European countries. It did real-time satellite data acquisition and image processing for remote-sensing applications. It captured long-duration (e.g., days or weeks) time sequences of satellite images and calculated cumulative data products from them on the fly. The whole system comprised a dish/LNA, a microwave satcoms receiver, one or more PCs fitted with custom interface cards that I also designed, configured to capture and process selected parts of the earth image in real time, and an expensive DEC colour printer. The system was a full-rate, primary-transmission PDUS, not a secondary wefax system. The hardware used a very expensive memory card with what was at the time the huge amount of 8MB RAM (built from 256k DRAMs), and it had an early Tseng Labs ET3000 https://en.wikipedia.org/wiki/Tseng_Labs super-VGA graphics card that I wrote image display software for, to show high-resolution pictures on the monitor; high resolution in those days being probably 800 x 600. We also experimented with early WORM optical drives for data archiving (this was before CD-RWs were available, or even PC CD drives).

It's not possible to make systems of that kind nowadays, because for many years now ESA has encrypted the data streams from their satellites; you have to buy an expensive decryption card from them and pay them licence fees if you want to make your own PDUS today. Back then, ESA thought nobody without a 10m dish and a $500k MacDonald Dettwiler stack would be able to capture and process the unencrypted data... they were wrong. 😁

I managed to keep hold of one of the prototype interface cards, shown in the photo below, about the only remnant of that system I have. This is going back in time a bit, as you can tell from the board. It's a 2-layer board with PTH vias, built on FR4 https://www.pcbmaster.com/news/what-is-fr-4.html with a gold-plated edge connector. The design includes 2 high-speed serial ports based around the 6402 CMOS UART and a PIO port, both for talking to the satellite receiver, and has a number of other features including an onboard hardware watchdog timer. Amazingly, it appears that UART is still available: https://www.renesas.com/en/document/dst/hd-6402-datasheet . That is a nice UART. The rest of it is mostly good ol' TTL. The serial ports were driven by a Texas 75176 line driver chip https://www.ti.com/lit/ds/symlink/sn75176a.pdf , and I used a CMOS 555 (yes! there had to be one!) to create the free-running watchdog timer https://www.ti.com/lit/ds/symlink/lmc555.pdf .

Even if the system clock stopped, the watchdog timer was guaranteed to pull the RESET line on the system bus and reboot the box, whereupon the data acquisition would automatically restart. While the software was running, it had to clear the timer register every few seconds or the watchdog would fire, so any software crash for any reason whatsoever resulted in a warm restart. Not very subtle compared to modern designs, but "if in doubt... use brute force". The main objective was to miss as few image-transmission time slots as possible, and the approach was successful. Crashes were very infrequent anyway; it would stay up for weeks at a time, provided there was power. The later versions of this board replaced those horrible cheap ceramic decoupling caps you can see with some better-quality tantalum types.

The software itself is sadly lost in the mists of time...

interface-card.jpeg


And this is a colour print from the DEC printer (I have forgotten the model) of a whole-earth image captured by the system at the time. That printer was just about the highest resolution we could find; I remember it was a very expensive piece of kit. The image shown is in false colour, after being processed by software from the raw captured data, which is greyscale. I think this was taken by the visible-light radiometer on the satellite. The colours have faded a bit, but it's not bad considering this printout is almost 40 years old; the paper has yellowed a bit too. (Actually... I'm starting to wonder if it was an early HP inkjet rather than a DEC; my memory might be wrong on the make. I remember it was expensive.)

whole-earth.jpeg


I don't know if this was any kind of "greatest achievement" 😂, but it was certainly one of the more interesting projects I have worked on.
 
Writing an entire utility in ASM for the original IBM PC, using the single-line text editor EDLIN.

It was complete with self-modifying code to maintain its own file allocation tables, and it used the 8087 numeric processor, if present.
...Piped into debug.exe to create a .com program.

I wasn't allowed to use any commercial software or shareware that the customer had not purchased on any PC supplied by the IBM reseller I was contracted to. This was a job building thousands of corporate PCs tailored to each department's requirements. As the machines were all running IBM PC DOS 6.2, none had a BASIC interpreter. I wrote lots of little hardware-detection utilities using edlin and debug to scan for ROM BIOS messages for network and graphics-card detection, test I/O, and partition and format hard drives. These went onto a special boot floppy that would automatically log into a server, detect where it was by network segment, and start the build process. When the technician entered the asset number, a fully automated install of Windows 3.1 would commence, with all device configuration decisions for the INI files resolved in advance by my utilities. The asset number was the only data entered to build the PC.

I created another automated build system for Windows NT a few years later for a large corporation. It used compiled source code that I had written for them. It worked better than the major PC supplier's own factory build system. When my customer told the PC manufacturer that they could build four times as many PCs per technician per day as the supplier could, they didn't believe them. The supplier delivered 100 PCs to my office and I was instructed to demonstrate the build system to them. I remember the visiting manufacturer's executives saying that their technician spent more time taking packing materials out to the truck than setting up the PCs.

My customer struck a deal with the PC supplier, and my code was handed over to them. I had written the code without the customer requesting it, but under contract terms where the customer owned all rights to work done during the contract. I was young and had not negotiated my own terms in advance. As I had already been paid for my time, the code was theirs. I was never asked to support the code, but I do know it was used in a number of regions to build many thousands of PCs by that supplier. There were techniques I used in that build system that I have since seen used in later DevOps systems. I am not claiming that anyone copied me, just that I managed to create a fully automated Windows NT 4 build system in the last century that could fully configure/reconfigure a machine for the role of its eventual user, with the specific applications required, when they first logged in on it. It is the code I am most proud of, and I have never written another build system for anyone else since.
 
My first program was written in PL/I: a program for representing 'amounts in words' on bank checks (my mother was involved in automating the central accounting departments of various ministries: the Ministry of Education, the Ministry of Justice, etc.). I was 15 years old (1980). She gave me this task as a joke. But later, it was my code that was used in the automated systems.

This was my first significant achievement in programming...😎
 
Yes, a PC classroom when I was 15. It looked like a sort of mainframe in the middle of the class, connected to dumb terminals at each seat.
To boot it, the teacher put in very large floppy disks.
 
Time-relative: When I was 18 I made a fish weighing and sorting program that was used by hundreds of boat personnel. It was in Turbo Pascal. Later I added an RS-232C datalink program to get things onto early Windows, with its totally garbage COM port communication.
Not to derail this thread, but I've done similar stuff with feed trucks (the ones that feed cows) in order to talk to the scale head. I used an incredibly customizable piece of software called BillRedirect. If you want a program to talk to COM ports, there is no better. Period.


Windows obviously, as most software is these days.
 
Blackbird, I see the UART, some 74-series chips & SN chips. Nice. Clock speed will have been low.
Yes, standard PC 16-bit ISA bus, 8 MHz bus clock I think. Don't look too closely at some of the sharp right angles in the traces(!)... but there was no real problem with reflections etc. at that speed; at least, the boards I designed all worked fine. Later on I got a CAD system that did autorouting for me, but that board was hand-drafted using transfers onto film, for both sides. The UART itself was clocked by the 4 MHz xtal oscillator that you can see beside it. I can't actually remember the serial line bit rate; it's quite a long time ago. There was no buffering at all in the satellite receiver... I had to parse the data stream live as it came down, i.e. while it was being transmitted, and capture it live. So the PC really was doing the basic data acquisition of the images. There was far too much data to capture the whole image and then pull out the segment you wanted; you had to parse each frame as it came down and select the scan-line segments you wanted in real time. That was all done in assembler, of course. It was quite good at the time... it would be much easier today with modern hardware, but then again, the downlink data rates are much higher today. :)
 