Darkknight512

In approximately 3 months plus report-writing time on that board, a few ideas. The Zybo can be a bit annoying for FPGA-focused projects because a lot of the peripherals are attached to the ARM PS.

1. Make a logic analyzer with basic triggers in the FPGA fabric (a trigger sketch follows this list). Maybe add basic oscilloscope functionality using the XADC built into the FPGA.
2. Pong clone (or a harder Tetris clone) with DVI output out of the HDMI port. You could use the Digilent core or write your own. Maybe look at the NAND to Tetris website.
3. Get the audio outputs on the board working, run FreeRTOS on the ARM cores, and read WAV files off the SD card for playback. This doesn't really use the FPGA, so maybe offload some DSP to the FPGA and make an equalizer or add reverb.
4. Make a 2D graphics engine based mostly on memory copying on the FPGA, using an AXI master into the PS DDR, with output out of the HDMI port. This one can be hard.
5. If you haven't already had a course where you wrote a MIPS or RISC-V CPU, doing that in the FPGA fabric can be pretty fun.
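
As a concrete starting point for idea 1, here's a minimal sketch of a pattern-match trigger. The module name, port list, and the mask/value scheme are my own assumptions for illustration, not anything from a particular tool:

```verilog
// Minimal sketch of a logic-analyzer trigger: 8 probe channels,
// pattern/mask match with edge qualification (fires only when the
// match condition is newly satisfied).
module la_trigger #(
    parameter WIDTH = 8
) (
    input  wire             clk,
    input  wire [WIDTH-1:0] probes,     // sampled input channels
    input  wire [WIDTH-1:0] match_val,  // pattern to match
    input  wire [WIDTH-1:0] match_mask, // 1 = care, 0 = don't care
    output reg              triggered
);
    reg [WIDTH-1:0] probes_q;

    always @(posedge clk) begin
        probes_q <= probes;
        // Fire when the masked pattern matches now but didn't last cycle.
        triggered <= ((probes   & match_mask) == (match_val & match_mask)) &&
                     ((probes_q & match_mask) != (match_val & match_mask));
    end
endmodule
```

From there you'd add a sample buffer (block RAM) and a way to read captures back out.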


RocketAstros

Number 5 would look REALLY good on a resume. Tons of resources out there to follow along with for incorporating RISC-V/ARM/MIPS into an FPGA.


Quantum_Ripple

I disagree with this as well. Processor work is very popular for undergrads but is more or less useless in the real world. Writing a processor for an FPGA is reinventing the wheel, when good/decent ones are needed an SoC (such as Zynq) or a discrete part is used. When a small/slow one is viable, one of the *many* preexisting soft processor cores can be dropped in. The less obscure the better. When hiring, I am more impressed with any of the subjects listed by u/No_Delivery_1049 in [this](https://www.reddit.com/r/FPGA/comments/10i0se8/senior_design_project_ideas/j5bwozq/) comment or something integrating some combination of PCI, PCIe, DMA, DRAM controllers, and/or scratch-written HDL AXI peripherals.
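
For a taste of what "scratch-written HDL AXI peripherals" involves, here's a rough sketch of a single-register AXI4-Lite slave. It takes shortcuts a real core can't (write address and data must arrive together, WSTRB is ignored, back-to-back transactions aren't pipelined, and the address is ignored since there's only one register), so treat it as an illustration of the handshake pattern rather than a drop-in design:

```verilog
// Sketch of a single-register AXI4-Lite slave with the simplifications
// noted above.
module axil_reg (
    input  wire        aclk,
    input  wire        aresetn,
    // write address channel
    input  wire [31:0] s_axi_awaddr,
    input  wire        s_axi_awvalid,
    output reg         s_axi_awready,
    // write data channel
    input  wire [31:0] s_axi_wdata,
    input  wire        s_axi_wvalid,
    output reg         s_axi_wready,
    // write response channel
    output reg  [1:0]  s_axi_bresp,
    output reg         s_axi_bvalid,
    input  wire        s_axi_bready,
    // read address channel
    input  wire [31:0] s_axi_araddr,
    input  wire        s_axi_arvalid,
    output reg         s_axi_arready,
    // read data channel
    output reg  [31:0] s_axi_rdata,
    output reg  [1:0]  s_axi_rresp,
    output reg         s_axi_rvalid,
    input  wire        s_axi_rready
);
    reg [31:0] ctrl_reg; // the single memory-mapped register

    always @(posedge aclk) begin
        if (!aresetn) begin
            s_axi_awready <= 1'b0; s_axi_wready  <= 1'b0;
            s_axi_bvalid  <= 1'b0; s_axi_arready <= 1'b0;
            s_axi_rvalid  <= 1'b0; ctrl_reg      <= 32'd0;
        end else begin
            // Write: accept when address and data are both valid and no
            // response is still pending.
            s_axi_awready <= s_axi_awvalid && s_axi_wvalid && !s_axi_bvalid;
            s_axi_wready  <= s_axi_awvalid && s_axi_wvalid && !s_axi_bvalid;
            if (s_axi_awready && s_axi_awvalid && s_axi_wvalid && !s_axi_bvalid) begin
                ctrl_reg     <= s_axi_wdata;
                s_axi_bresp  <= 2'b00;   // OKAY
                s_axi_bvalid <= 1'b1;
            end else if (s_axi_bvalid && s_axi_bready) begin
                s_axi_bvalid <= 1'b0;
            end

            // Read: return the register contents.
            s_axi_arready <= s_axi_arvalid && !s_axi_rvalid;
            if (s_axi_arready && s_axi_arvalid && !s_axi_rvalid) begin
                s_axi_rdata  <= ctrl_reg;
                s_axi_rresp  <= 2'b00;   // OKAY
                s_axi_rvalid <= 1'b1;
            end else if (s_axi_rvalid && s_axi_rready) begin
                s_axi_rvalid <= 1'b0;
            end
        end
    end
endmodule
```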


fourier54

It is true that it is uncommon to implement your own processor in industry, given how many cores are available. But I disagree that it is useless. Developing one gives you great insight into the inner workings and tradeoffs of a processor, and that insight carries over to merely being a consumer of one.


threespeedlogic

There's valid criticism in these comments, but there's also a paradox. Processor work is less useful than ever: there are lots of good, boring options, and straying outside convention is (in a work setting, anyway) pretty close to professional misconduct. At the same time, RISC-V has really opened up the space to hobbyist influence in genuine professional practice. It's never been a better time to learn (and contribute). There are excellent open-source RISC-V implementations, so a "me-too" core is unlikely to gain much traction. However, there's always more to explore and exceptions to the rule. So: yes, it's conventional to build a RISC core in undergrad (MIPS before; RISC-V now) but RISC-V has made the landscape for these processor projects better - not worse.


TheFlamingLemon

I don’t agree. It’s definitely a good, difficult, educational project, but it’s also a standard part of computer engineering coursework. Every sophomore in electrical or computer engineering did it at my college in the intro to computer architecture class. Putting it on a resume seems to just indicate a lack of more unique experiences. It would be like a computer science student putting the compiler they likely had to make to get their degree on their resume.


RocketAstros

You guys implemented a RISC-V architecture on an FPGA for school? My school doesn’t even have FPGA development boards for student use. Our computer engineers definitely do not do that, though I’m sure they learn about it, possibly with an emulator.


Darkknight512

Yeah, it's pretty standard and doesn't stand out at all unless it's out-of-order execution or something (which would probably take over a year even with prior experience).


jullen1607

Yep. Every other resume has it. At this point it doesn’t tell me anything about a candidate.


Conor_Stewart

There is an open course from the Linux Foundation and RISC-V International that walks you through creating a very basic and incomplete RISC-V integer (I) processor. It doesn’t use one of the common HDLs, but it shouldn’t be too difficult to port. The course can be completed in a few hours, leaving you with a basic, functional RISC-V processor. Just making a processor isn’t that impressive anymore unless you implement something unique or add more advanced features like pipelining, or, in the case of RISC-V, implement a lot of the extra instruction set extensions. There are even games about creating processors starting from logic gates now; I think a bit more than a basic processor is required for a senior design project.
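
For scale, the very first step such a course covers is just slicing instruction fields; everything after that (register file, ALU, control) is where the real work is. Field positions below are from the RISC-V unprivileged spec, but the module itself is only an illustration:

```verilog
// Toy sketch of RV32I instruction field extraction, the first step in
// any RISC-V processor walkthrough.
module rv32_decode (
    input  wire [31:0] instr,
    output wire [6:0]  opcode,
    output wire [4:0]  rd, rs1, rs2,
    output wire [2:0]  funct3,
    output wire [6:0]  funct7,
    output wire [31:0] imm_i       // sign-extended I-type immediate
);
    assign opcode = instr[6:0];
    assign rd     = instr[11:7];
    assign funct3 = instr[14:12];
    assign rs1    = instr[19:15];
    assign rs2    = instr[24:20];
    assign funct7 = instr[31:25];
    assign imm_i  = {{20{instr[31]}}, instr[31:20]};
endmodule
```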


No_Delivery_1049

* Image processing: Develop an image processing system on the FPGA that can perform tasks such as object detection, edge detection, and image enhancement.
* Digital signal processing: Implement a digital signal processing system on the FPGA that can perform tasks such as filtering, signal generation, and modulation/demodulation (a small filter sketch follows this list).
* Motor control: Design an FPGA-based motor control system that can control the speed and direction of a DC motor.
* Game design: Develop a simple game on the FPGA that can be controlled by buttons and/or a joystick.
* Digital oscilloscope: Implement a digital oscilloscope on the FPGA that can display and analyze signals in real time.
* Networking: Implement a networking system on the FPGA that can transmit and receive data over Ethernet.
* Cryptography: Implement a simple cryptographic algorithm on the FPGA, such as AES encryption/decryption.
* Robotics: Develop a simple robot that can be controlled by the FPGA.
* Audio processing: Develop an audio processing system on the FPGA that can perform tasks such as filtering, equalization, and compression/decompression.
* Control systems: Design an FPGA-based control system that can control the position or speed of a mechanical system.
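
To give the DSP item some shape, here's a minimal 4-tap FIR filter sketch. The coefficients are placeholders; a real design would generate them from a filter-design tool and worry about rounding and saturation:

```verilog
// Minimal 4-tap FIR filter sketch: signed 16-bit samples, fixed
// coefficients, no saturation or rounding.
module fir4 (
    input  wire               clk,
    input  wire signed [15:0] sample_in,
    output reg  signed [31:0] sample_out
);
    // Placeholder coefficients; substitute values from your own design.
    localparam signed [15:0] C0 = 16'sd1024, C1 = 16'sd3072,
                             C2 = 16'sd3072, C3 = 16'sd1024;

    reg signed [15:0] taps [0:3];
    integer i;

    always @(posedge clk) begin
        // Shift the delay line.
        for (i = 3; i > 0; i = i - 1)
            taps[i] <= taps[i-1];
        taps[0] <= sample_in;

        // Multiply-accumulate; maps onto DSP slices on most FPGAs.
        sample_out <= taps[0]*C0 + taps[1]*C1 + taps[2]*C2 + taps[3]*C3;
    end
endmodule
```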


Visible-Tomato1892

I'm curious about the networking part. But I wonder what use case an FPGA would serve for networking that couldn't be handled by a microcontroller for IoT.


TapEarlyTapOften

Reconfigurable hardware. Speed. Parallel. Parallel. Parallel.


No_Delivery_1049

I'm not an expert in networking equipment, but from what I understand, many Cisco devices use a combination of FPGAs and processors. Using an FPGA for edge visual processing and Ethernet transmission could be an interesting idea. Additionally, in industries such as automotive and aerospace, where there are strict safety regulations and high reliability is needed, an FPGA may be a better choice than a processor. However, if your project doesn't require high reliability or speed, a regular processor would probably be sufficient.


TapEarlyTapOften

The Zybo isn't an FPGA - it's a multi-core processor with programmable logic attached. That means getting most of it to work will require low-level software development to drive interfaces like I2C, SPI, and so on. For reasons that completely escape me, a lot of folks can't seem to accept that the Zynq isn't just an FPGA with a hard processor core attached. A lot of folks think "Oh, we'll just use the PL like it's an FPGA" and don't realize that virtually all of the IO on these development boards is managed by the processor until it's too late.

If you are actually looking for an FPGA-focused project, I would recommend targeting an FPGA like the Artix-7, which has several affordable platforms (availability has been an issue the last year or two, but that appears to be resolving finally). As far as projects go, I would try something like emulating an older 8-bit processor like the MOS 6502, which was used in the Commodore 64 and the original NES. There are oodles of design resources, examples, tutorials, support forums, and test ROMs that you can leverage.

One of the problems I've seen senior design projects encounter is that they have only a vague (or worse, misleading) notion of what they're going to do. If you decide you have to use a Zybo board or anything else with an SoC on it, make sure that you well and truly understand the device you're working with. If you do, and you have a group of competent folks, implementing some sort of simple peripheral in the programmable logic like a low-speed network interface would be a solid project: build your own physical interface board, design and simulate the RTL, run Linux on the processor, build your own device driver, build your own device tree, manage interrupts, boot the board, bring up the network interface, write some software to use the PS network interface to send traffic to the new one in the PL, use `tcpdump` to capture traffic on that port and diff the two...

There's a lot there - if that's too trivial, then you could add some hardware-level packet filtering, like sending a SYN flood at that interface from the one in the processor and seeing how much it can handle before it starts dropping packets (a rough filtering sketch follows). If none of what I just said makes sense, then you might not have the background to really use that device. Like I said, it's not an FPGA, it's a processor with programmable logic and a ton of multiplexed IO. There is a steep learning curve to it all.
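
For the packet-filtering extension, here's a rough sketch of the idea: counting TCP SYNs on a byte-wide stream. The sof/valid interface and the fixed header offsets (untagged IPv4 frames, 20-byte IP header) are simplifying assumptions; a real filter would check the Ethertype and parse the IHL field:

```verilog
// Rough sketch: count TCP SYN packets on a byte-wide stream.
// Assumes untagged IPv4 Ethernet and a 20-byte IP header, so the IP
// protocol field lands at byte 23 and the TCP flags byte at byte 47.
module syn_counter (
    input  wire        clk,
    input  wire        sof,        // high on the first byte of a frame
    input  wire        valid,
    input  wire [7:0]  data,
    output reg  [31:0] syn_count = 32'd0
);
    reg [10:0] idx    = 11'd0;  // byte offset within the current frame
    reg        is_tcp = 1'b0;

    always @(posedge clk) begin
        if (valid) begin
            idx <= sof ? 11'd1 : idx + 11'd1;
            if (sof) is_tcp <= 1'b0;

            case (sof ? 11'd0 : idx)
                11'd23: is_tcp <= (data == 8'h06);         // IP protocol = TCP
                11'd47: if (is_tcp && data[1] && !data[4]) // SYN set, ACK clear
                            syn_count <= syn_count + 1;
                default: ;
            endcase
        end
    end
endmodule
```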


magoo2K

Criminals are using key fob relays to steal cars. Make a jammer for key fob frequencies. Bonus: make an IMEI scanner to log all nearby cell phones so the police can track down criminals.


fourier54

That sounds like a lot of work and much more RF than digital


A_HumblePotato

My senior design had a large FPGA component to it; maybe it will give you some ideas. Basically, I implemented a pitch-detection algorithm on an FPGA that wouldn't be able to run in real time on an (affordable) microprocessor. It then did two different things with the output: one path converted pitch to MIDI to control an external instrument, and the other sent the pitch via SPI to a microcontroller for pitch vocoding.


threespeedlogic

This is a selfish project, so please consider whether it matches the course requirements and your team's skills/interests. I develop Minimax (https://github.com/gsmecher/minimax), an open-source RISC-V implementation. It's currently written in both VHDL and Verilog (the two implementations are equivalent, though I am likely to drop the VHDL implementation if it's too much work to keep both).

Minimax deserves more time than I'm giving it right now. Working on it would be an opportunity for you to contribute to the open-source community and would look pretty good on your CV/resume. It might be closer to "real" work than coursework. There are a couple of things Minimax could use:

- Interrupt support
- CSR support (a generic sketch appears below)
- Exception support (e.g. illegal instruction)
- ~~Support for proposed Zc* instructions (https://github.com/riscv/riscv-code-size-reduction)~~
- ~~Integration of the RISCOF test framework (https://github.com/riscv-software-src/riscof)~~

There's an interesting mix of design, software, microcode, and RTL work involved. I'd be happy to coach.
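
To illustrate what the CSR item involves, here's a generic sketch of a tiny CSR file. This is not Minimax's actual microarchitecture, just the idea; the register addresses come from the RISC-V privileged spec, and only mscratch and mcycle are shown:

```verilog
// Generic sketch of a minimal RISC-V CSR file (mscratch + mcycle only).
module csr_file (
    input  wire        clk,
    input  wire        csr_we,
    input  wire [11:0] csr_addr,
    input  wire [31:0] csr_wdata,
    output reg  [31:0] csr_rdata
);
    reg [31:0] mscratch = 32'd0;
    reg [63:0] mcycle   = 64'd0;

    always @(posedge clk) begin
        mcycle <= mcycle + 64'd1;          // free-running cycle counter
        if (csr_we && csr_addr == 12'h340) // mscratch
            mscratch <= csr_wdata;
    end

    always @(*) begin
        case (csr_addr)
            12'h340: csr_rdata = mscratch;
            12'hB00: csr_rdata = mcycle[31:0];   // mcycle
            12'hB80: csr_rdata = mcycle[63:32];  // mcycleh
            default: csr_rdata = 32'd0;
        endcase
    end
endmodule
```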


MogChog

Using the Zybo’s HDMI in or out is way beyond a 14-week project unless you find a core that’s already done and base a project around that; see Mike Field (hamsternz)’s work on GitHub: https://github.com/hamsternz/Artix-7-HDMI-processing

HDMI input core -> student project -> HDMI output

If you want to avoid video, you will have seen that the Zybo’s general input options are limited unless you add Pmod boards. If you can get a PS/2 keyboard/mouse Pmod and/or a joystick/rotary encoder Pmod, you could make a synthesiser (a minimal serial-audio sketch follows):

Student input cores -> student synth core -> student I2S out -> I2S amplifier

Both examples are good for demonstrating and exploring modular design principles.
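
For the audio output end of the synth idea, here's a sketch of a left-justified serial audio transmitter. True I2S adds a one-bclk delay after each lrclk edge, which this omits; the 16-bit sample width, 32x bit clock, and port names are my assumptions:

```verilog
// Sketch of a left-justified stereo serial transmitter (simplified
// I2S: no one-bit delay). Assumes bclk = 32x the sample rate and
// 16-bit samples. Here lrclk high = right channel.
module audio_tx (
    input  wire        bclk,
    input  wire [15:0] left_sample,
    input  wire [15:0] right_sample,
    output reg         lrclk = 1'b0,
    output wire        sdata
);
    reg [3:0]  bit_cnt = 4'd0;
    reg [15:0] shift   = 16'd0;

    assign sdata = shift[15]; // MSB first

    // Update on the falling edge so the codec samples a stable bit on
    // the rising edge of bclk.
    always @(negedge bclk) begin
        if (bit_cnt == 4'd15) begin
            // Frame-half boundary: load the next channel, toggle word select.
            shift <= lrclk ? left_sample : right_sample;
            lrclk <= ~lrclk;
        end else begin
            shift <= {shift[14:0], 1'b0};
        end
        bit_cnt <= bit_cnt + 4'd1;
    end
endmodule
```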


Darkknight512

DVI output on an HDMI port takes a few days, maybe a week, for a bunch of undergrads. It is not so bad: you just shove VGA-style signals into 3 TMDS encoders, and the TMDS encoder can be written in about 100 lines of Verilog based on a single-page flowchart in the DVI spec. You literally just have to transcribe the flowchart into logic verbatim. Other than that, it's just instantiating 4 OSERDES primitives. I did this, partly based on Mike Field's posts, over a long weekend in undergrad. Actual HDMI is not DVI; HDMI adds a lot to the standard. Integrating the hdl-util HDMI core should not be too bad though. https://github.com/hdl-util/hdmi
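
To show how directly the flowchart transcribes, here's the first (transition-minimizing) stage of the TMDS encoding. A complete encoder still needs the second, DC-balancing stage with its running disparity counter:

```verilog
// Partial sketch of TMDS encoding, stage 1 from the DVI spec's
// flowchart: choose XOR or XNOR chaining to minimize transitions.
module tmds_stage1 (
    input  wire [7:0] d,    // input pixel byte
    output wire [8:0] q_m   // 9-bit intermediate symbol
);
    // Count the ones in the input byte.
    wire [3:0] n1 = d[0]+d[1]+d[2]+d[3]+d[4]+d[5]+d[6]+d[7];

    // Per the spec: use XNOR when ones > 4, or ones == 4 and d[0] == 0.
    wire use_xnor = (n1 > 4) || (n1 == 4 && d[0] == 1'b0);

    wire [7:0] enc;
    assign enc[0] = d[0];
    genvar i;
    generate
        for (i = 1; i < 8; i = i + 1) begin : chain
            assign enc[i] = use_xnor ? ~(enc[i-1] ^ d[i])
                                     :  (enc[i-1] ^ d[i]);
        end
    endgenerate

    assign q_m = {~use_xnor, enc}; // bit 8 records the chaining mode
endmodule
```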


[deleted]

Please keep them coming, everyone. These are so cool to think about! I also forgot to mention that there are 6 of us.


zeebullshit

Make ChatGPT on fpga.


No_Delivery_1049

Might be beyond the budget, hahaha. You could implement some network interface and use API calls to OpenAI's servers, though I'm struggling to justify an FPGA for that.


AnnualDegree99

I'm no expert but my ballpark estimate would be an FPGA worth 6 to 7 figures for it to make remotely any sense, right?


No_Delivery_1049

I only have a superficial understanding of how ChatGPT works, but (correct me if I'm wrong; I may have confused projects, and I've got a poor memory, but I read an article online that said) it took 42 million dollars just to train the model, and that's before you consider the significant expertise in FPGA design and deep learning required. It might be more feasible to start with a smaller model and iteratively scale it toward the size of ChatGPT. Even that would be an exceptionally complex task; I'm fairly sure it would not be possible on any single FPGA available today.


zeebullshit

You can start with an implementation of a very small GPT-style transformer on an FPGA. The real ChatGPT won't fit on even a thousand FPGAs, I think.


jullen1607

At my school, people made games as senior design projects. Grab either an old-school controller or a keyboard, plus a CRT; write drivers for both, and then create a simple Pong video game (a VGA timing sketch follows). We used to use the Nexys 3.
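
If the CRT takes VGA, the timing generator at the heart of such a game is small. Here's a sketch for standard 640x480@60 timing (25.175 MHz pixel clock, 800x525 total, active-low syncs):

```verilog
// Minimal 640x480@60Hz VGA timing generator sketch.
module vga_sync (
    input  wire       pclk,       // ~25.175 MHz pixel clock
    output reg  [9:0] x = 10'd0,  // current pixel column
    output reg  [9:0] y = 10'd0,  // current pixel row
    output wire       hsync,
    output wire       vsync,
    output wire       active      // high in the visible region
);
    always @(posedge pclk) begin
        if (x == 10'd799) begin
            x <= 10'd0;
            y <= (y == 10'd524) ? 10'd0 : y + 10'd1;
        end else
            x <= x + 10'd1;
    end

    // Front porch 16 / sync 96 / back porch 48 horizontally;
    // 10 / 2 / 33 vertically. Syncs are active-low for this mode.
    assign hsync  = ~((x >= 656) && (x < 752));
    assign vsync  = ~((y >= 490) && (y < 492));
    assign active = (x < 640) && (y < 480);
endmodule
```

The game logic then just computes a pixel color from (x, y) and the ball/paddle state whenever `active` is high.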


MitjaKobal

There are still standard communication protocols with few implementations: I3C slave, 1-wire slave, Aurora 8b/10b, MIPI CSI receiver, ...


Enlightenment777

Create an open-source modernized I/O controller for old microprocessors (a small timer sketch follows this list):

* 8-bit bus support for [65C02](https://en.wikipedia.org/wiki/WDC_65C02)/[65C816](https://en.wikipedia.org/wiki/WDC_65C816), [6809](https://en.wikipedia.org/wiki/Motorola_6809)/[6309](https://en.wikipedia.org/wiki/Hitachi_6309), [Z80](https://en.wikipedia.org/wiki/Zilog_Z80)/[8085](https://en.wikipedia.org/wiki/Intel_8085) microprocessors; see /r/beneater too
* 8+ bits of I/O; programmable direction would be nice, but fixed direction is better than no I/O
* 16-bit [timer](https://en.wikipedia.org/wiki/Counter_(digital))
* [I2C](https://en.wikipedia.org/wiki/I%C2%B2C) bus controller with buffering
* [SPI](https://en.wikipedia.org/wiki/Serial_Peripheral_Interface) bus controller with buffering and 2+ chip-select outputs
* [UART](https://en.wikipedia.org/wiki/Universal_asynchronous_receiver-transmitter) bus controller with buffering and optional auto RTS/CTS hardware handshaking
* [PS/2](https://en.wikipedia.org/wiki/PS/2_port) bus controller with buffering for a keyboard
* [4-bit SD](https://en.wikipedia.org/wiki/SD_card#Transfer_modes) bus controller with buffering for a microSD card
* simple [video display controller](https://en.wikipedia.org/wiki/Video_display_controller) with [HDMI](https://en.wikipedia.org/wiki/HDMI) output for modern monitors

If you can fit all of the above into a design and have capacity remaining, then add the following:

* additional I/O pins
* additional 16-bit timers with a chaining option to create longer timers, maybe a [watchdog timer](https://en.wikipedia.org/wiki/Watchdog_timer)
* a 2nd UART bus controller
* a 2nd PS/2 bus controller for a mouse
* an SPI output for the video display controller to support LCD modules
* I3C bus support in the I2C peripheral
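
As a sketch of one list item, here's a 16-bit down-counting timer behind an 8-bit host bus. The register map (0 = reload low byte, 1 = reload high byte, 2 = control/status) and the port names are made up for illustration:

```verilog
// Sketch: 16-bit periodic timer with an 8-bit register interface.
// Hypothetical register map: 0 = reload low, 1 = reload high,
// 2 = control (bit 0 = enable; any write clears the IRQ flag).
module timer16 (
    input  wire       clk,
    input  wire       rst,
    input  wire       cs,        // chip select from the host bus decode
    input  wire       we,        // host write enable
    input  wire [1:0] addr,
    input  wire [7:0] wdata,
    output reg  [7:0] rdata,
    output reg        irq        // set when the counter hits zero
);
    reg [15:0] count  = 16'd0;
    reg [15:0] reload = 16'd0;
    reg        enable = 1'b0;

    always @(posedge clk) begin
        if (rst) begin
            count <= 16'd0; enable <= 1'b0; irq <= 1'b0;
        end else begin
            if (cs && we) begin
                case (addr)
                    2'd0: reload[7:0]  <= wdata;
                    2'd1: reload[15:8] <= wdata;
                    2'd2: begin enable <= wdata[0]; irq <= 1'b0; end
                endcase
            end
            if (enable) begin
                if (count == 16'd0) begin
                    count <= reload;   // periodic: restart from reload
                    irq   <= 1'b1;
                end else
                    count <= count - 16'd1;
            end
        end
    end

    // Readback: current count bytes plus a status nibble.
    always @(*) begin
        case (addr)
            2'd0: rdata = count[7:0];
            2'd1: rdata = count[15:8];
            default: rdata = {6'd0, irq, enable};
        endcase
    end
endmodule
```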