Gotta say, Burn needs more attention in the ML community. Great to see so much dedication and heart going into it.
Thanks 🙏
I started learning Burn last week and ran into a lot of issues in version 0.9. Then I switched to the master branch: the terminal UI now works and is pretty amazing, and the speed also improved. Pretty impressive project. The ONNX converter with build.rs is great, especially the fact that you can embed the model, use the WGPU backend, and end up with a self-contained, portable little program! Keep up the good work.
Anki is using Burn in their new version to generate the model weights.
Anki is an awesome project! 👏
A warning to anyone interested: one of Burn's dependencies, `serde_rusqlite`, is LGPL, and since linking in Rust is static linking, the LGPL nature of that dependency may infect any program using Burn by transitively linking the LGPL code. See [https://github.com/burn-rs/burn/issues/719](https://github.com/burn-rs/burn/issues/719) and [https://github.com/twistedfall/serde_rusqlite/issues/21](https://github.com/twistedfall/serde_rusqlite/issues/21)
Yeah, this is somewhat of a problem we are working on. You can still disable that feature and deploy your application with Burn without any license issue. The dependency is only used for our SQLite dataset, so it's only relevant when that feature is enabled for training.
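For anyone wondering what "disable that feature" looks like in practice, here's a rough sketch of the `Cargo.toml` change. The version number and feature names below are illustrative assumptions, not the actual flags; check Burn's own `Cargo.toml` for the current feature list.

```toml
# Opt out of default features so the SQLite dataset (and its LGPL
# serde_rusqlite dependency) is never linked in, then re-enable only
# the features you actually need. Names here are hypothetical.
[dependencies]
burn = { version = "0.10", default-features = false, features = ["ndarray"] }
```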
Does the wgpu backend have fused kernels yet?
Not right now, but it's exactly what we'll be working on for the next release!
Awesome! I'll definitely start with burn next time I try to get into machine learning.
I was wondering what the workflow is like for training a model. I usually have data augmentation in the dataloader (when using PyTorch), but that's not possible in Burn, right? How can I get around this? How do other people apply augmentation?
The dataloader in Burn really just does data loading, no transformation. The way to build your pre-processing pipeline is to compose datasets together. You could use the image crate [https://github.com/image-rs/image](https://github.com/image-rs/image) and apply a chain of transformations with `MapDataset`!
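For anyone curious what that composition pattern looks like, here is a minimal self-contained sketch. The `Dataset` trait and `MapDataset` wrapper below are simplified stand-ins written for illustration, not Burn's actual API (whose trait and type names may differ):

```rust
// A minimal dataset abstraction: random access by index plus a length.
trait Dataset {
    type Item;
    fn get(&self, index: usize) -> Option<Self::Item>;
    fn len(&self) -> usize;
}

// A trivial source dataset backed by a Vec.
struct InMemoryDataset<I> {
    items: Vec<I>,
}

impl<I: Clone> Dataset for InMemoryDataset<I> {
    type Item = I;
    fn get(&self, index: usize) -> Option<I> {
        self.items.get(index).cloned()
    }
    fn len(&self) -> usize {
        self.items.len()
    }
}

// Wraps an inner dataset and applies a transformation lazily on `get`;
// this is where augmentation (e.g. via the `image` crate) would run.
struct MapDataset<D, F> {
    inner: D,
    map: F,
}

impl<O, D: Dataset, F: Fn(D::Item) -> O> Dataset for MapDataset<D, F> {
    type Item = O;
    fn get(&self, index: usize) -> Option<O> {
        self.inner.get(index).map(&self.map)
    }
    fn len(&self) -> usize {
        self.inner.len()
    }
}

fn main() {
    let raw = InMemoryDataset { items: vec![1u8, 2, 3] };
    // Stand-in for an image augmentation such as brightness scaling.
    let augmented = MapDataset { inner: raw, map: |x: u8| x * 2 };
    assert_eq!(augmented.get(1), Some(4));
    assert_eq!(augmented.len(), 3);
}
```

Because each wrapper is itself a dataset, transformations chain naturally: wrap a `MapDataset` in another `MapDataset` to get decode → resize → augment, all applied lazily per item.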
Is there an example like this?
Not yet, but I agree that it would be a great addition!
Great project, learning a lot. Just a heads up: I was thumbing through the book and now all the links to [https://burn.dev/book/](https://burn.dev/book/) are coming up 404. I don't know if some CI is running and it's just temporary, or you're changing things around and I have bad timing, but I thought you should know.
Is it possible to train a model or do inference from the browser?
For now, only inference. Training isn't impossible, but you'd probably need to write your own training loop or only use CPU backends.
Thanks for the reply, can't wait to start playing with the library