Assorted Coding Projects
Ever since I discovered that useful code could be written by an average person, not just by professionals for official purposes, I have been interested in putting it to use and learning useful languages and techniques. The first language I encountered was Visual Basic, through which I attempted mainly to write code for games. One of my first successful programs, though not written by me alone - I had help from the person who was teaching me the language - was a simulated bowling alley. The user could bowl down a lane, choosing when to launch the ball, and if the ball was aligned with the center, they would get a strike. It was a simple program, but any programming experience helps build the mindset needed for coding and provides tools that will be valuable in future projects.
I tried to make an online video game but was in way over my head, with no idea of the real limitations of Visual Basic for what I intended, so I failed to complete that project despite learning much from it about basic, core coding functions. Redeeming my ties with Visual Basic, during the car ride back from a FIRST Tech Challenge competition that I had participated in with my team, I coded the game Pong in a few hours. I recently dug up my code for Pong, along with an old executable file that I had exported. As far as I can tell looking back, the code considers the angle at which the ball impacts surfaces, as well as whether it bounces off one of the sliders' corners or a flat surface, when determining the reflection angle.
Two players could control the sliders using the arrow and WASD keys, and the game would display a "You Lose" banner on the side that failed to deflect the ball. For added difficulty, I included code that would slowly move the sliders closer together, reducing the gap and thus the time between impacts. I know that if I had had an internet connection during that car ride, I would have found some generic Pong impact noises and coded them into my program form to play on each ball impact, just for fun and added authenticity.
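The original Visual Basic source isn't reproduced here, but the flat-surface versus corner distinction in the reflection logic can be sketched in C, my preferred language today; the struct and function names are illustrative, not the original code:

```c
typedef struct { double x, y, vx, vy; } Ball;

/* Flat slider face: only the horizontal velocity component reflects. */
void bounce_flat(Ball *b) {
    b->vx = -b->vx;
}

/* Corner hit: both components reflect, sending the ball back along
   a different path than a flat-face bounce would give. */
void bounce_corner(Ball *b) {
    b->vx = -b->vx;
    b->vy = -b->vy;
}

/* Top or bottom wall: the vertical component reflects. */
void bounce_wall(Ball *b) {
    b->vy = -b->vy;
}
```

The game loop would pick which of these to apply based on where the ball's position overlaps a slider on the frame of impact.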
Since then I have branched out through C and C++ to Windows batch and script files and math-oriented coding like Excel data processing and Matlab, while also trying my hand at mechatronics-geared languages like RobotC for FTC and Arduino for other microcontroller projects. I aim to continue improving my coding skills through research online, as I have been doing, while always working on new projects to test and improve them.
C & C++
Following Visual Basic, the next language I was exposed to was C, which I learned in my computer-engineering classes. C felt far more natural and fundamental than Visual Basic, which is why, to this day, it remains my favorite language. In some cases I enjoy using C++ features to shorten my work and remove clutter from my code, but even those are rooted in C. Class assignments I enjoyed coding included a blackjack game played against a computer dealer, image manipulation by pixel value, programming a Pololu 3pi robot to traverse a maze, and an environment that lets a user set a password according to security rules requiring certain types of characters.
In my free time, I coded Einstein's Quiz in C and exported it as an executable so that I could share it with friends. The quiz is meant to be difficult to follow and solve, but basic logic, taking each clue one step at a time, will always arrive at the same answer. Using the 15 clues from the original Einstein's Quiz, which can be found online along with its answer, I wrote the program with only the knowledge gained from my first C-based engineering-computers class. The program delivers the quiz questions, tracks the time elapsed since the quiz was started, and reports how long the user took once they submit their answer.
Since some of the magic of the quiz is lost once the answer is known, I also included an option to submit one's own variables into the quiz - there are 25, arranged into five categories. This way, anyone who had taken the quiz before could take it again with new variables and be forced to rethink the logic, and the program would tell them whether their answer was correct rather than leaving them to look it up online. The program was simple enough to code, but it took me a long time to iron out the kinks, teaching me a lot of code-organization skills and shortcuts along the way, so I am very proud of the final working product.
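The timing portion of the quiz can be sketched with the standard time.h facilities; this is a minimal illustration of the idea, not necessarily the exact calls my original program used:

```c
#include <time.h>

/* Whole seconds elapsed since `start`; the quiz records `start`
   when the questions are first shown and calls this once the
   user submits an answer. */
long elapsed_seconds(time_t start) {
    return (long)difftime(time(NULL), start);
}
```

A call like `elapsed_seconds(quiz_start)` then feeds directly into the completion-time report printed at the end.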
During the process of exporting thousands of bitmap images from Creo Parametric's model-animation tool, for conversion into videos for this site, I encountered an export error that took me days to understand. Adobe Premiere told me that an error had occurred and the file could not be exported, and searching for the error code online suggested that one of my input files was corrupted. Since bitmap images are stored at nearly 100 times the size of their PNG counterparts, I had converted every bitmap to a PNG to save storage and reduce the time Premiere would take to load each file. I assumed these conversions were at fault for the corruption Premiere was catching, so I started by re-converting them in case it had been a one-time error, which didn't solve the problem.
Next, I checked the metadata of a known-to-be-working PNG image and compared it to one from the group that was causing the error. I found that there was a large amount of extra metadata attached to each image exported by Creo that could not be interpreted, while all working images had only the necessary data fields, so I went down a different path that I do not regret taking despite it ultimately not being the correct solution.
I dug up one of my old coding projects from my Computers in Engineering class, which was designed to use C++ headers to manipulate bitmap images at the pixel level. I spent most of two days re-organizing, researching and replacing code so that the program would take an input bitmap file, read every pixel value, write each to a new blank Bitmap object, and export that object as an entirely new bitmap file containing virtually the same image as the original. I initially intended to pair this C++ program with a batch file I had previously written, which parses through a folder, takes the path of each file as a variable one at a time, and executes some command with it - in my case, launching a new instance of my C++ program with the path as input. When I did this, however, it opened instances of the program by the tens per second, working toward 7000 in total, and bogged down my computer by attempting to simultaneously read and write 7000 bitmap images at two million pixels each.
I quit this futile attempt after half an hour and found that it had only finished around 1000 of the images and was beginning to open new instances for the bitmaps my program had written, in addition to the originals, since the output folder was inside the original folder. I scrapped this idea and instead coded the parsing into my C++ program as a while loop, executing one bitmap rewrite at a time and ditching the batch file entirely. It took several hours to get the code working correctly, but when it finally functioned, it got all 7000 images done in less than half an hour.
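The one-rewrite-at-a-time loop amounted to something like the following sketch; the folder name and zero-padded naming scheme here are assumptions, and the actual pixel reading and writing relied on class-provided bitmap headers not shown:

```c
#include <stdio.h>

/* Build the path of frame i into buf, e.g. "frames/frame0042.bmp".
   The folder layout and numbering scheme are illustrative. */
void frame_path(char *buf, size_t len, int i) {
    snprintf(buf, len, "frames/frame%04d.bmp", i);
}

/* The rewrite loop: one frame at a time within a single process,
   rather than one process per frame as the batch-file approach
   ended up attempting. */
void rewrite_all(int total) {
    char path[64];
    for (int i = 1; i <= total; i++) {
        frame_path(path, sizeof path, i);
        /* read every pixel of `path` and write it out as a fresh
           bitmap in a separate output folder (headers not shown) */
    }
}
```

Keeping the loop inside one process is what brought the runtime down to under half an hour for all 7000 images.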
I was deeply saddened to find that Adobe Premiere still had a problem with even my new images, and I was forced to finally do what I should have begun with, which led me to the real problem. I was able to narrow the failure down to a certain percentage of the way through the encoding process and proceeded on the assumption that this percentage would be roughly proportional to the culprit frame's position in the numbered sequence. I didn't want to search through all 7000 frames, so I started at the frame given by my calculation - the export failed 16% of the way through, and 16% of 7000 is frame 1120 - and sifted outward in both directions. Sure enough, I found a single frame that Premiere couldn't load no matter what I did to it, so I copied its contents into a paint file and saved it as an entirely new image. Premiere accepted this new image and allowed me to complete my export.
So, in the end, my bitmap-rewriting program was completely useless for the application that it was meant for. It forced me to learn new file input-output techniques and apply my coding knowledge to a real problem, however, so I can accept having spent a lot of time working on it nonetheless.
In a very similar project, more for fun and testing my coding skill than to solve an existing problem, I took that same image-manipulation code I had written for class and wrote a separate program in my free time in 2017, for a different image-editing purpose. Reddit had just completed its three-day April Fools' project, r/place: a massively successful online canvas, prepared by Reddit, that any Reddit user could edit. Participants were told that, using any Reddit account made before a certain date, they could change any one pixel on a 1000x1000-pixel canvas to one of the 16 colors of 4-bit bitmap encoding, once every five minutes.
Over the 72 hours it spanned, 1.1 million users placed a total of 16.5 million pixels. I took part in the creation of the canvas and placed a total of 18 pixels, one of which remained on the final canvas, and the connection I had to the project, though small, motivated me to do something more with it once it concluded, as many other users did. A Steam application for PC called Wallpaper Engine, which had just recently been released, allows anything from videos to interactive programs and web browsers to run within the desktop background. A few hours into the creation of the r/place canvas, I came up with the idea of creating a time-lapse video from images of the canvas and uploading it to Wallpaper Engine as a video background for others to use.
I attempted to write my own small batch file to take screenshots of my screen every few minutes with the canvas centered, but I was late to the party by more than a day, and my code fell short in many ways. Other people had gotten on board nearly instantly after the project commenced, using APIs to fetch and save the updated canvas once every 5 seconds, so I let their images do the work instead of my own inferior set. Luckily for me, they made the images publicly available as a zipped download, which ended up filling 149 gigabytes of storage on my computer.
This was when I wrote a batch script using ImageMagick's convert function to efficiently parse through a folder and convert every one of the 49772 bitmap images to lossless PNGs, for more realistic storage and faster loading. As PNGs, they took up - and still take up - only 10.8 gigabytes of space on my drive, far more manageable than 149.
The canvas was 1000x1000 pixels, but most computer monitors at the time were 1080p, with a resolution of 1920 by 1080 pixels. I wanted the canvas to serve as a background for such computers, and I didn't want to throw a boring black backdrop behind it, so I settled on writing a C++ program to fill that space and rewrite all 49772 images. I hadn't touched my image-manipulation code in over a year at that point, so it took a few days of free time to write and run the image rewriting. My program reads each image as a 1000x1000 array of pixel values, then copies each value into the 1000x1000-pixel section at the center of a new 1920x1080-pixel image. It then reads the leftmost edge's pixel values and applies each pixel's color to every pixel directly to its left, doing the same with the other three sides, effectively stretching the edge pixels to fill the rest of the frame. Finally it stretches the corner pixels to fill the remaining unassigned corner sections, so that no pixel is left unassigned.
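The three steps - center copy, edge stretch, corner stretch - can be captured in a single pass by clamping indices into the source region. This is a sketch of the idea rather than my original C++ code, with a placeholder packed-pixel type standing in for whatever the bitmap headers used:

```c
#define SRC_W 1000
#define SRC_H 1000
#define DST_W 1920
#define DST_H 1080

typedef unsigned int Pixel;  /* placeholder packed-color type */

/* Copy src into the center of dst, then fill the border by
   repeating the nearest edge (or corner) pixel outward. */
void expand_canvas(Pixel src[SRC_H][SRC_W], Pixel dst[DST_H][DST_W]) {
    int ox = (DST_W - SRC_W) / 2;   /* 460-pixel side margins */
    int oy = (DST_H - SRC_H) / 2;   /* 40-pixel top/bottom margins */
    for (int y = 0; y < DST_H; y++) {
        for (int x = 0; x < DST_W; x++) {
            /* clamp into the source: interior pixels copy through,
               border pixels repeat the nearest edge/corner value */
            int sx = x - ox, sy = y - oy;
            if (sx < 0) sx = 0; else if (sx >= SRC_W) sx = SRC_W - 1;
            if (sy < 0) sy = 0; else if (sy >= SRC_H) sy = SRC_H - 1;
            dst[y][x] = src[sy][sx];
        }
    }
}
```

Clamping handles the corners for free, since both coordinates saturate at once there, which is exactly the corner-stretch behavior described above.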
One problem that I encountered was that I kept getting an error message saying that I was attempting to write pixel data to an area of the image array that didn’t exist. I was stuck with this problem for a while before walking through the program step by step and finding that I had my columns and rows swapped in two important for loops. With this, the program was completed, and I ran it for a few hours, letting it rewrite all of the images as bitmaps. Sadly, the C++ headers that I was using to manipulate my images required bitmap input and output files, otherwise I would have preferred to use PNG files for their storage-size benefits. This new set of images took up 309.8 gigabytes of space on my hard drive, so I once again converted them all to PNGs, at this point reaching 480 gigabytes in total used storage. I eventually deleted all of the bitmaps, but not before ensuring that none of the PNGs were corrupted or constructed incorrectly.
I also accidentally translated the 4-bit bitmap inputs to 24-bit bitmaps improperly, since 24-bit was the default encoding used by my bitmap headers, which completely scrambled the color representations. I fixed this problem easily, but it was a funny mistake that produced some interesting-looking images, with a color-value swap occurring every few images.
Having exported all of the images intended for the final video, I moved them into an Adobe Premiere project, but found that at 60 fps the entire time-lapse would run 13.8 minutes. I decided to cut it in half, removing every other image, shortening it to 6.9 minutes and effectively speeding it up by a factor of two. I wanted to pair a song with it; I knew of no 14-minute songs, but I had one in mind that was almost exactly seven minutes long, and it fit perfectly with the new video. To perform this thinning of the roughly 49000 files, I wrote another batch file that parses every file in a folder and copies every nth file, in order by name, into a new folder, where n is a specified integer - in this case two. I have used this batch file on several other occasions for similar purposes.
Once I resolved an issue with the project's field order setting, changing it from the default "lower field first" to "progressive scan" to match my image import settings, I exported the video at the highest quality I could achieve in Adobe Premiere, using the widely accepted H.264 encoder to ensure compatibility with Wallpaper Engine. Within a few days of uploading my new wallpaper to Steam, a few hundred people had subscribed to use it as their background, and as of 2019 it has reached over 1500 total subscribers. It is, by both rating and subscriber count, the most popular r/place-based wallpaper on the platform. The hard work that went into making a good product both served well over a thousand people and taught me new ways to use old tools.
The final important C program that I wrote, extremely simple yet conceptually fun, is a good transition into my Arduino code, because I wrote a program with an identical purpose for Arduino as well. When I began coding my Nixie tube clock, I quickly found issues with my Arduino Mega's ability to keep accurate time. I could watch on screen as my Arduino counted seconds next to my computer's (very accurate) clock and notice the two diverge within less than 30 seconds. This amount of error - nearly 1%, putting the Arduino behind by one second every 129 seconds - is unacceptable in a clock, so I thought I could correct it by finding the factor relating the two times: real time and fake time, as I called them in my programs.
So, treating my computer's clock as real time and my Arduino's clock as fake time, I wrote two programs, both called micros after the built-in micros function in the Arduino libraries. In C, run as an executable, I wrote a never-ending while loop that on each cycle displays the number of microseconds elapsed since program start. Using the gettimeofday function from the sys/time.h header to store the time in a variable at the beginning of the program and again on each cycle, I could subtract the starting time from the current time in microseconds and display the result. This program kept one of my computer's four processor cores at 40% capacity and another at 25% for its duration, which translated to higher temperatures and noticeably faster fans - likely because, judging by the time between numbers displayed in the console, it was printing a new number roughly 10,400 times per second on average.
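The C version of micros boiled down to something like the following; the bounded loop here stands in for the original never-ending one so the sketch terminates:

```c
#include <stdio.h>
#include <sys/time.h>

/* Microseconds between two gettimeofday() readings. */
long long elapsed_us(struct timeval start, struct timeval now) {
    return (now.tv_sec - start.tv_sec) * 1000000LL
         + (now.tv_usec - start.tv_usec);
}

/* Print elapsed microseconds repeatedly; the original ran forever
   in a while loop rather than taking an iteration count. */
void show_elapsed(int iterations) {
    struct timeval start, now;
    gettimeofday(&start, NULL);
    for (int i = 0; i < iterations; i++) {
        gettimeofday(&now, NULL);
        printf("%lld\n", elapsed_us(start, now));
    }
}
```

Spinning in this loop with no sleep is also why the program pinned a core: it polls and prints as fast as the terminal can keep up.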
The next step would have been much simpler had I known how to read serial COM ports from within a C program, but at the time I had no idea such a thing was possible. It is possible to pipe the Arduino's serial output through PowerShell into the current CMD terminal, combine it with the data coming from the micros program running simultaneously, and export it all to a file for analysis. Instead, I ran the computer and Arduino separately - my Arduino running its micros program and printing to the serial monitor, my C micros program running in the terminal - and wrote another batch file to take screenshots once per hour. I later went through every screenshot manually, in order, and entered into an Excel table the most recent number displayed on both monitors in each image. For the Arduino version I used the built-in micros function in place of gettimeofday, but everything else was the same, apart from the Arduino's measly output rate of a few hundred updates per second.
Using Excel, I processed those numbers, which were taken over the course of 11 days, hoping to settle on a consistent factor that would keep the clock accurate. While the generated divergence curve did converge to a consistent factor, I noticed as it accumulated that the difference in rates between the Arduino clock and my computer clock fluctuated throughout the day, peaking at night and falling during the day. The only cause I could think of was temperature changes in the room, and sure enough, upon researching proper accurate clock modules, I found that they are all paired with a thermometer to track and compensate for temperature fluctuations, which must affect the resonance rate of the quartz timing crystal.
With my factor determined, though, I pressed on, hoping it would still prove accurate throughout the year. After a few days of running my working Nixie clock next to my computer, I predicted it would diverge from real time by less than a minute per year, which I could live with, but when I ran the clock in the summer it was off by almost 30 seconds in less than a week. Thus, I switched to a two-dollar eBay-bought temperature-compensated time module, the DS3231, which in its five months of running thus far has diverged by only roughly 10 seconds.
The Arduino Mega made my Nixie clock possible, since the clock required many output pins and the capacity to run my clock program, which stores and will be able to display the date, temperature and time, and uses the date to determine daylight saving time and the weekday. My roommates and I also used an Arduino Nano to control our RFID door lock during my junior year in college, giving me more exposure to Arduino. Knowing how to code in Arduino's language has really come in handy, and not just in classes; its similarity to C made it easy to pick up after a few C-based classes. Though I barely did any of the coding for the door lock, I did read the code that was written and was able to understand its logic. I plan to model, code and assemble a similar RFID locking mechanism for my living unit in the coming year.
Just as Excel helped me determine the real-to-fake time factor for my clock, I have found it useful in many other math-related scenarios involving large data sets, and for visualizing trends in data. Once the clock tuning was completed and I had moved on to the DS3231 for my timing needs, I decided I wanted the clock to correct itself for daylight saving time as well. Using Excel, I typed up the equations for the Julian Day Number - a single integer representing any Gregorian or Julian date - which can be used in arithmetic to determine the weekday, and to apply simple adjustments to an input date with addition and subtraction before converting back to the Gregorian system.
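Translated into C, the date-to-Julian-Day-Number arithmetic looks like this. This is the common Fliegel-Van Flandern formulation; my Excel cell formulas may have differed in form, but the idea is the same:

```c
/* Gregorian date to Julian Day Number (Fliegel-Van Flandern).
   Relies on C's truncating integer division. */
long julian_day_number(int y, int m, int d) {
    long a = (m - 14) / 12;
    return (1461L * (y + 4800 + a)) / 4
         + (367L * (m - 2 - 12 * a)) / 12
         - (3L * ((y + 4900 + a) / 100)) / 4
         + d - 32075;
}

/* Weekday from a Julian Day Number: 0 = Sunday ... 6 = Saturday. */
int weekday(long jdn) {
    return (int)((jdn + 1) % 7);
}
```

Because consecutive dates map to consecutive integers, shifting a date by days is plain addition on the JDN, which is exactly what made the daylight-saving adjustments easy.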
The code that I wrote in Excel let me understand the mechanism behind finding the Julian Day Number and test that it worked for every day in every year it was meant to cover. Once I had that code settled and working, I had a new clock problem to deal with, which I also used Excel - specifically its conditional formatting - to solve. When the clock is first plugged in, the module, with its CMOS battery, has the current time stored; all the Arduino needs to do is read and display it. However, the clock has no idea whether daylight saving time is active, so the flag variable responsible for storing that state cannot be set correctly. I needed code that could use the date and day of the week, combined with the time of day, to determine whether daylight saving time was active for every possible combination. The trouble was writing a line of code that could tell whether the changeover day had already passed or was still ahead when the current date fell within the week in which the change occurs.
I brought every possible month-day and weekday combination into Excel and created a 7x7 array of the combinations, then checked the formulas I came up with for separating situations where the changeover was yet to come from those where it had already occurred. I conditionally formatted each output cell to display white if it was marked as before the daylight saving change and red if after, then tweaked my function until the output cells matched the expected values. I moved the resulting formula back into my Arduino code for the November and March DST cases and wrote a function that runs at program start to determine the daylight saving status from the date and time.
I determined that when (15 - monthday - (7 - weekday) <= 0) in March, daylight saving time was not active, and when (8 - monthday - (7 - weekday) <= 0) in November, daylight saving time is active. Any other time in the month does not require considering both the day of the week and the day of the month, and is much easier to filter with simple comparative if statements. I have also used Excel for many of the calculations and for easy-to-access data storage in the design of my lathe: tracking the weights, costs and quantities of materials, determining stresses under expected loads, and working out how to machine certain parts.
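Transcribed directly into C, the two edge-week tests read as follows. The weekday numbering convention and the additional time-of-day checks from the actual clock code are not shown here, so treat this as a sketch of the formulas alone:

```c
/* True in March when daylight saving time is not active,
   per the formula derived in Excel. */
int march_dst_inactive(int monthday, int weekday) {
    return (15 - monthday - (7 - weekday)) <= 0;
}

/* True in November when daylight saving time is active,
   per the formula derived in Excel. */
int november_dst_active(int monthday, int weekday) {
    return (8 - monthday - (7 - weekday)) <= 0;
}
```

Outside the changeover weeks, the month and simple day-of-month comparisons settle the flag without these functions.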
Most notably, I used Excel's conditional formatting on the output of some simple matrix math to help me reverse-engineer the tooth counts of the gears in a professionally manufactured lathe's drive train. I created a matrix of the gear ratios given by possible gear combinations and subtracted from it a matrix containing only the expected ratio given by the gearbox's known rpm outputs. I then conditionally formatted the result to display values closest to zero in red and higher-magnitude values in white, and visually searched for the smallest-magnitude entries as the most likely correct combinations. This helped tremendously, and I believe I was able to use these figures to make an exact replica of a certain Colchester lathe's drive gearbox.
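The build-a-ratio-matrix-and-subtract idea maps naturally to a small C sketch; the tooth counts and target ratio used in the test are placeholders, not the Colchester figures:

```c
#include <math.h>

/* Fill err[i*n + j] with |teeth[i]/teeth[j] - target| for every
   driver/driven pairing; the smallest entries mark the candidate
   gear pairs, just like the near-zero red cells in the spreadsheet. */
void ratio_errors(double target, const int *teeth, int n, double *err) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            err[i * n + j] = fabs((double)teeth[i] / teeth[j] - target);
}

/* Index of the smallest error, i.e. the best-matching pair. */
int best_pair(const double *err, int n2) {
    int best = 0;
    for (int k = 1; k < n2; k++)
        if (err[k] < err[best]) best = k;
    return best;
}
```

The conditional-formatting step is just a visual version of `best_pair`: instead of scanning for the minimum programmatically, the red cells pop out of the white background.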
Similarly, I knew the threads-per-inch outputs of most generic quick-change gearboxes used for thread cutting and automatic-feed-cutting on lathes, and I was able to piece together the most likely number of teeth on each gear in the gearbox using analysis of the steps in tpi.
While designing a bevel gear in Creo Parametric using standard formulas, I found that it would be easier to find the input values for my model if I typed those formulas in Excel and created formulas dependent on cell fields into which I could enter values specific to the gear that I was designing. This way I wouldn’t need to manually calculate each value, and this method would allow me to return with a different gear problem later on, enter the new values and proceed. It was a simple tool but was fun to make and use.
Excel provided the perfect interface for editing the massive amount of data collected in my roller-coaster acceleration-recording endeavor. I was able to apply linear shifts to sections of values, add translations, invert values, take derivatives of the curves, adjust the input data to force boundary constraints of zero velocity and zero acceleration at the beginning and end, and apply smoothing algorithms for better visualization. Then, using simple plotting, I placed all three motion axes onto separate scatter plots of the coaster's acceleration, velocity and position to visualize and confirm my adjustments, and, with the smoothed data, to attribute times and data peaks to the events of the ride.
Matlab can perform feats similar to Excel's, but it is more difficult to code in, in my opinion, and less visually oriented. Excel's cell-based structure greatly helps with real-time visualization of formulas and trends, while Matlab offers more of a trial-and-error, brute-force environment. Still, it has its uses, especially for recursive or iterative functions, which Excel, as far as I know, cannot perform. Matlab has been very useful in classes requiring the calculation of stiffness matrices and the like, performing stress and strain analysis on those matrices, approximating function values with numerical-analysis approaches, and finding the error of such approximations with respect to the number of iterations.
Coding meshes with boundary conditions, forces, temperatures and moments applied to them can be extremely useful in finite-element analysis, which Matlab is particularly efficient at solving. Similarly, Matlab is quite good at running many iterations of a function, feeding each run's data into the next. It was used to solve nearly every problem in my Numerical Analysis class, ranging from Secant and Newton method implementations to minimax approximation.
The one time Matlab has saved the day so far in my own projects was when I was attempting to use Excel to minimize the result of force and moment balancing for the cross slide on my lathe. I wanted to determine the maximum downward force, at the furthest possible cantilever, that could be exerted on the mill table I designed and mounted on the cross slide without permanently damaging the linear bearings that the carriage rides on. Had I continued guessing and checking in Excel, it would have taken hours to converge near the maximum moment arm and minimum force that would cause permanent damage.
Instead, I coded the mesh geometry and maximum allowed forces into a Matlab program along with the math involved in calculating the moments given by those forces around an iteratively adjustable center point. I determined that the intersection point between three functions would maximize the force, thus converging both on the center point about which the moments were performed and a more precise value for the maximum force. I did not need a very precise value, as even slight manufacturing differences would change these final values by a fair amount, but I wished to test the limits of my program. I ran it several times, each time further narrowing down the x, y and z range that it focused on, knowing that the triple-point where the three functions were equal was within that region, until I got a value precise to one ten-billionth of a pound.
While Matlab is certainly capable, and I expect I will use it a lot in my engineering career, I am much more fond of the small yet extremely useful and versatile batch files and Visual Basic scripts that can be run in Windows. For example, I was able to use a few basic lines of Visual Basic in a few scripts to operate the most basic music-control functions in iTunes when it was open on my computer: running a script would send the play command, and iTunes would begin to play a song.
The exciting application of this for me, though, was through Cortana on Windows, since the "AI" was given almost no ability to control third-party programs. I saved shortcuts to my scripts in the Start Menu folder, which Cortana pulls from when deciding what program to start, and all I had to do was ask her to run the script named "Play" and she would run it, making iTunes play the current song. I wrote scripts for the play/pause, stop, forward and backward functions. They only operated correctly for a short time, until for some reason she could no longer recognize the shortcuts by name and my Task Scheduler setup for running the scripts without administrative permission prompts stopped functioning. It was fun while it lasted.
My keyboard has an extra set of five macro keys that I can program to perform certain functions, but the functions available within the keyboard's software were too limiting for my purposes. I wanted a key that would open Chrome and instantly bring me to my bookmarks page, so I wrote a script, run on keypress, that opens a new Chrome window or tab and sends the keystrokes to open the bookmarks page within that instance.
I combined batch files and Visual Basic scripts in a successful attempt to use NirCmd's savescreenshotfull function to save images of an online-book webpage. After each successful screen capture it would send the page-down keystroke to scroll to the next page, then repeat the process until the entire book was saved as a series of number-ordered PNG images. At the time I knew how to save screenshots from batch files but not how to send keystrokes, which required the controlling batch file to run a script on each pass to perform the page-down action. I was then able to combine the nearly 1000 images into my own PDF book and was spared from spending hundreds of dollars on a paper textbook for my class, or from purchasing temporary online access. In the end, it just made it easier to do my assignments without constantly ending up in my roommate's room to reference his online book.
I used a small set of batch files hundreds of times in the creation of the video media found on this site. Creo Parametric, my 3D-modeling program, has built-in animation tools that let the user export the frames of an animation either as a video or as a sequence of separate bitmap images. Bitmaps are an extremely inefficient lossless file type, so each of the nearly 200 animation videos that I exported was made up of thousands of images which collectively took up between 10 and 30 gigabytes per video. My first remedy for this problem, since I have neither infinite storage space nor infinite time to wait for those files to load into editing software, was to write a small batch file that parses through a folder, checks each file’s type, and converts every bitmap or JPEG image to PNG. The PNG format, especially for low-texture images, compresses very efficiently, enough to bring each bitmap’s file size down by an order of magnitude. By copying that code, I made similar files that could convert from any image format into bitmap or JPEG as well as PNG, should the need for such files arise.
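The conversion logic can be sketched in Python as a function that scans a folder and builds one ImageMagick command per matching image (`convert in.bmp out.png` is ImageMagick's classic invocation; the function and parameter names are my own). Keeping the command-building separate from execution makes the folder-parsing step easy to verify.

```python
import os

def convert_commands(folder, target_ext="png",
                     source_exts=(".bmp", ".jpg", ".jpeg")):
    """Scan a folder and build one ImageMagick `convert` command per
    image whose extension matches, converting it to target_ext.
    Non-matching files (and already-converted ones) are skipped."""
    commands = []
    for name in sorted(os.listdir(folder)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in source_exts:
            src = os.path.join(folder, name)
            dst = os.path.join(folder, stem + "." + target_ext)
            commands.append(["convert", src, dst])
    return commands
```

Each command can then be run with `subprocess.run`; swapping `target_ext` and `source_exts` gives the BMP- and JPEG-output variants mentioned above with no other changes.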
I also used that conversion code as the base for two other useful batch files that I used plenty of times in my media-creation endeavors. When I was importing images into Adobe Premiere to create my r/Place time-lapse, mentioned above, I discovered that the video would run nearly 15 minutes if I played all 50,000 frames at 60 fps. I wanted to cut this down by some factor, in my case a factor of two, so I branched the PNG conversion batch file into a new file. This batch parses a folder’s contents in alphanumeric order and selects every nth file, for a preset integer n, to rename and export as a copy of the original. This way, I could go through my 50,000 image files and export every other image, or every third or every fourth image and so on, to achieve the video length I desired without having to import all 50,000 images into my Premiere project and rely on its functions to speed up or slow down playback.
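The every-nth-file selection is straightforward to express in Python (the original was a batch file; the `frame_` naming scheme here is my own illustration). Renumbering the copies keeps the exported subset a continuous, import-ready sequence.

```python
import os
import shutil

def export_every_nth(src_dir, dst_dir, n):
    """Copy every nth file (in alphanumeric order) from src_dir into
    dst_dir, renumbering the copies so they still form a continuous,
    zero-padded sequence. Returns the list of source names kept."""
    os.makedirs(dst_dir, exist_ok=True)
    names = sorted(os.listdir(src_dir))
    kept = names[::n]  # every nth file, starting with the first
    for i, name in enumerate(kept, start=1):
        ext = os.path.splitext(name)[1]
        shutil.copy(os.path.join(src_dir, name),
                    os.path.join(dst_dir, f"frame_{i:05d}{ext}"))
    return kept
```

With n = 2 on 50,000 frames, the output folder holds 25,000 renumbered copies, which at 60 fps halves the video's running time exactly as described.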
My other batch file, also branched from the conversion programs, had a slightly different goal in mind and, like my conversion code, took advantage of the command-line image editor ImageMagick. I wanted to systematically crop large sets of images, extracting the same rectangular region of pixels from every one, mainly to quickly strip window frames, program features and the taskbar out of screenshots. In a few scenarios in Creo I needed to export frames for videos in which a cross-sectional slice moves through a 3D object to better show its internal structure, particularly for my Rubik’s Cube and worm-gear models; this is not a native function of Creo’s animation tools. Because of that, I had to write long strings of keyboard-and-mouse macros to control my computer at high speed, manually adjusting the cross-section cut for each frame and saving screenshots to a folder to later be converted into a video. I ended up with a few thousand images in total between the cube and worm-gear projects and wanted to crop out the usable area of the screen to clean up the final video product. The crop batch file parses through every image in a folder and crops each one according to adjustable input constraints for height, width and the x and y origin. Using this batch file, I prepared every image for import into Premiere and successfully exported my cropped videos without any quality loss.
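The crop step maps directly onto ImageMagick's `-crop` geometry, which takes a width×height region and the x/y offset of its top-left corner (`-crop WxH+X+Y`, with `+repage` resetting the canvas so the output is only the cropped region). A Python sketch of the batch's folder-parsing, with my own function name and file-type filter:

```python
import os

def crop_commands(folder, width, height, x, y, out_dir):
    """Build one ImageMagick command per image in `folder`, cropping a
    width x height region whose top-left corner sits at (x, y).
    "+repage" resets the virtual canvas so the output file contains
    only the cropped region."""
    commands = []
    for name in sorted(os.listdir(folder)):
        if name.lower().endswith((".png", ".bmp", ".jpg", ".jpeg")):
            commands.append(["convert", os.path.join(folder, name),
                             "-crop", f"{width}x{height}+{x}+{y}",
                             "+repage",
                             os.path.join(out_dir, name)])
    return commands
```

For example, `crop_commands(folder, 1920, 1040, 0, 40, out_dir)` would trim a 40-pixel strip from the top of 1920×1080 screenshots, the kind of fixed-region cleanup described above. Because `-crop` on a lossless format copies pixels rather than re-encoding detail, the cropped frames lose no quality.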
Coding has an unbelievably large range of mechatronic and computational applications. I expect that efficient coding will remain a necessary asset in society for a long time to come, and a tool that becomes more commonly understood and used to fine-tune our daily lives. It has certainly had a positive impact on my life thus far, and on my ability to problem-solve and create tools for otherwise tedious tasks.