Posts

Showing posts from January, 2025

Wind Energy

To meet growing energy demands around the world, people from many fields, including environmental conservationists, engineers, material scientists, and aspiring innovators, have worked collaboratively to achieve countless breakthroughs in the efficiency, density, and waste-material reduction of renewable capture technologies (semiconductors, electrical generators, etc.). As a result, solar panels, wind turbines, geothermal exchanges, and other renewable systems have become increasingly available to consumers and have dropped substantially in price. Although wind's first recorded contribution to global electricity generation did not occur until 1978 and initial adoption was slow, it has become one of the fastest-growing forms of energy throughout the 21st century [1]. In 2000, 15,276.96 total terawatt-hours of usable energy was generated, with 5,809.34 TWh derived from coal, 2,745.09 TWh from natural gas, 2,629.08 TWh from hydropower, 2,540.46 TWh from nuclear, 1,323....
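
To put those figures in perspective, below is a quick share-of-total calculation using only the values quoted above; a minimal sketch, not official statistics (the wind figure is truncated in this excerpt and therefore left out):

```python
# Shares of global generation in 2000, from the TWh figures quoted above.
# Illustrative only; the wind figure is truncated in the excerpt and omitted here.
total_twh = 15_276.96
sources_twh = {
    "coal": 5_809.34,
    "natural gas": 2_745.09,
    "hydropower": 2_629.08,
    "nuclear": 2_540.46,
}

for name, twh in sources_twh.items():
    share = twh / total_twh * 100
    print(f"{name:>12}: {twh:>9,.2f} TWh ({share:4.1f}% of total)")
```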

The Ocean Cleanup

After centuries of synthetically produced goods, decades of waste material have accumulated, and despite globally accessible recycling methods with seemingly infinite reuse applications*, many individuals and companies wrongfully discard such goods, leading to pollution with chain-reaction effects. Foremost, not all goods are created equal, with some posing greater environmental harm than others. As it stands, plastics are of topmost concern, followed by metals including aluminum, steel, and copper. (Although sometimes modified for industrial applications, glass forms naturally when magma or a lightning strike encounters sand, since high temperatures temporarily change its physical state and therefore its properties. Additionally, cardboard and paper products are primarily derived from trees and therefore consist mostly of carbon atoms.) Following their popularity in consumer goods, plastics have seen an invasive species-like introduction (not having evolved together overt...

Inductive Charging

Since electricity was first generated artificially, safe distribution over vast distances, long-term storage media, and output sources have been continuously refined over several decades. Following the rise of mobile computing throughout the early 2000s, more people than ever depend upon point-source charging for daily tasks including educational assignments, workplace efforts, navigation, emergency communications, annual taxes, and more. Between constant use and current battery limitations, convenient power sources have become a common priority, with year-over-year innovations. Most recently, inductive charging* (a method of wirelessly transferring energy from one source to another) has become increasingly popular among consumer smartphones, laptops, and tablets, each using different identifiers as well as technology. Notably, MagSafe, Qi, Qi2, and Qi2 Ready can be found across most modern devices including Apple’s iPhone, MacBook, Watch, etc., Google Pixel/Tablet, an...
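
As a rough illustration of the physics behind inductive charging (a toy calculation, not any particular standard's implementation), the voltage induced in a receiving coil depends on how quickly the current in the transmitting coil changes, scaled by the two coils' mutual inductance. The values below are assumed purely for illustration:

```python
import math

# Illustrative values only; real charging coils and drive frequencies vary by product.
mutual_inductance_h = 5e-6     # mutual inductance between the coils, in henries (assumed)
peak_current_a = 1.0           # peak AC current in the transmitting coil (assumed)
frequency_hz = 128_000         # drive frequency, roughly in the range used by Qi-style chargers

# For a sinusoidal drive I(t) = I_peak * sin(2*pi*f*t), Faraday's law gives a peak
# induced EMF of M * dI/dt|max = M * I_peak * 2*pi*f in the receiving coil.
peak_emf_v = mutual_inductance_h * peak_current_a * 2 * math.pi * frequency_hz
print(f"Peak induced EMF in the receiving coil: {peak_emf_v:.2f} V")
```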

Quantum Computing

Often confused with the next iteration of binary-based computers, quantum computing serves as a separate method of digital calculation with unique use cases. Since their introduction in 1978, classical computers have commonly depended upon components including a mainboard, central processing unit (CPU), (if dedicated) a graphics processing unit (GPU), random access memory (RAM), a long-term storage medium (HDD, SSD, etc.), input/output sources, a power source, and an enclosure [1]. While each has been developed and refined over time, they collectively allow for code execution and the calculations necessary to display visual information. As the name implies, quantum computing exercises principles of superposition, entanglement, decoherence, and interference to calculate probabilities, which results in a dependence upon vastly different technologies to function. Qubits, like a combination of binary bits and the theory of Schrödinger’s cat, perform calculations with fixed ones and zeros as well as ...
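
As a toy sketch of what superposition means in practice (not how a real quantum processor is programmed), a single qubit can be described by two complex amplitudes whose squared magnitudes give the probabilities of measuring a 0 or a 1:

```python
import math

# A single qubit state |psi> = alpha|0> + beta|1>, here an equal superposition
# (the state produced by a Hadamard gate acting on |0>).
alpha = 1 / math.sqrt(2)   # amplitude of |0>
beta = 1 / math.sqrt(2)    # amplitude of |1>

# Measurement collapses the superposition; the squared magnitudes are the
# probabilities of reading out a classical 0 or 1.
p_zero = abs(alpha) ** 2
p_one = abs(beta) ** 2

print(f"P(0) = {p_zero:.2f}, P(1) = {p_one:.2f}")   # 0.50 each
assert math.isclose(p_zero + p_one, 1.0)            # probabilities always sum to 1
```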

Level 4 EV Charging

After gasoline-based automobiles first entered mass production over a century ago, industries devoted their focus to constructing refueling stations throughout the world and proceeded to build other establishments around them. For many decades this continued to be a priority and led to abundant gas stations, price competition, and economic dependence. Although the first electric vehicles debuted well before this time, limited electricity access in rural areas and greater costs ultimately resulted in decreased demand, with little to no production by 1935 [1]. Though the majority continued to favor internal combustion engines, interest shifted in the late 1960s following increased fuel costs and resulted in EV prototypes, which were further promoted by governmental legislation against petroleum emissions. By 2010, Toyota, Honda, and Tesla Motors had all entered mass production with consumer deliveries, and more companies soon followed suit. As a result, both public and personal chargers hav...

ARM Processors

As electricity is to the grid, transistors are to computers. Developed in 1978, Intel’s 8086 16-bit chip was among the first processors used in personal computers and went on to establish the x86 architecture [1]. Introduced in 1985 by Acorn Computers, ARM (Advanced RISC (Reduced Instruction Set Computing) Machine) was a licensable design alternative: it often relied on higher-level programming to interface with components, but its simpler instruction set required far fewer transistors, allowing designs to be smaller, more power-efficient, and cheaper to manufacture. Due to this efficiency and the limited footprint of mobile enclosures, Acorn’s design was implemented in 98% of mobile devices by 2005, predating the releases of the Apple iPhone, Google Pixel, Samsung Galaxy, and more [2]. Today, 99% of mobile devices have adopted this architecture, and its large-scale success has driven further integration into tablets, laptops, desktops, and smartwatches. While Acorn itself manufactures very few processors, its designs are continuously ref...
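
As a loose illustration of the reduced-instruction-set idea (a toy model with made-up mnemonics, not real ARM or x86 encodings), a single "add one memory location to another" operation can be broken into a few simple register-based steps, each of which is cheap to implement in hardware:

```python
# Toy RISC-style machine: only loads and stores touch memory;
# arithmetic operates on registers. Mnemonics are invented for illustration.
memory = {"a": 7, "b": 5}
registers = {}

risc_program = [
    ("LOAD", "r1", "a"),    # r1 <- memory["a"]
    ("LOAD", "r2", "b"),    # r2 <- memory["b"]
    ("ADD",  "r1", "r2"),   # r1 <- r1 + r2
    ("STORE", "r1", "a"),   # memory["a"] <- r1
]

for op, x, y in risc_program:
    if op == "LOAD":
        registers[x] = memory[y]
    elif op == "ADD":
        registers[x] = registers[x] + registers[y]
    elif op == "STORE":
        memory[y] = registers[x]

print(memory)  # {'a': 12, 'b': 5}
```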

EV Electric Motors

Invented in 1823 by Samuel Brown, the first internal combustion engine sparked new possibilities for industrial development and inspiration among other engineers [1]. From 1830 to 1836 his innovation moved canal water in England, but it ceased operation after the waterways were shut down by the London and Croydon Railway Act of 1835 [2]. Due to its limited capabilities, Brown’s design was retired and refined through multiple iterations by others such as Étienne Lenoir, George Brayton, and Nicolaus Otto. Once modern combustion engines capable of traveling long distances had been developed, mass production of gasoline-based automobiles soon followed. In 1908, Henry Ford released the widely affordable Model T, convincing many that fossil fuels were the future of transportation [3]. While electric vehicles were invented well before this time, they had several drawbacks including slow recharging speeds, limited battery capacities, insufficient electric motors, expensive price tags, and more. S...

Multijunction Solar Panels

In 1983, solar was first recorded as supplying electricity for global usage, yielding less than 1 terawatt-hour (TWh) [1]. Exactly 40 years later, in 2023, 4,264 TWh was supplied, yet it met just 2.33% of total energy needs. While any is arguably better than none, many individuals and manufacturers have realized that renewables producing so little relative to fossil fuel output are not enough to stabilize the environment. To mitigate this, decades of efficiency studies, refinement, and integration have occurred, with promising results. Although the general design has remained quite similar, with components (from outermost to innermost) including glass, an encapsulant, the solar cells, more encapsulant, a back-sheet, a junction box, and mounting brackets, the semiconductors themselves have undergone significant development. Currently, monocrystalline and polycrystalline panels are available for consumer purchase and average 22% conversion efficiency, but both have drawbacks [2]. As their names suggest, monocrystalline contai...
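
To give a rough sense of what a 22% conversion efficiency means in practice, here is a back-of-the-envelope output estimate; the panel area, irradiance, and sun-hours below are assumed, illustrative values rather than figures from any datasheet:

```python
# Back-of-the-envelope output for a conventional ~22%-efficient panel.
panel_area_m2 = 1.8          # typical residential panel area (assumed)
irradiance_w_per_m2 = 1000   # standard test condition irradiance
efficiency = 0.22            # average conversion efficiency quoted above

peak_power_w = panel_area_m2 * irradiance_w_per_m2 * efficiency
print(f"Peak output under standard test conditions: {peak_power_w:.0f} W")

# Daily energy for an assumed 5 peak-sun-hours of sunlight:
peak_sun_hours = 5
daily_energy_kwh = peak_power_w * peak_sun_hours / 1000
print(f"Approximate daily energy: {daily_energy_kwh:.1f} kWh")
```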

Silicon-Carbon Battery Advancements

Although batteries were first invented centuries ago, their advancements have not followed the same developmental arc as most modern technologies. When the first “wet cell” battery was invented in 1800, it was composed of brine-soaked cardboard, copper, and zinc, but could not provide a sustained current over extended periods of time [1]. To resolve this, John Daniell began work on copper sulfate and zinc sulfate electrolytes, with a functioning prototype completed in 1836. Due to its success and consumer feasibility, the Daniell cell entered mass production and was used for over a century in various household electronics. Over the next 23 years additional improvements were made, leading to the first rechargeable lead-acid battery in 1859, which remained a universal standard until the testing of lithium batteries in 1958 [2]. During this transition, countless other breakthroughs occurred, including the invention of the Leclanché cell (the forerunner of the dry-cell battery) in 1866 by Georges Leclanché, nickel-cadmium re...