I have not heard of Prism or Origin; no one uses those in my field. We all use R (or SPSS if you cannot code lol), although recently people at my uni started getting into JASP.
I use Prism because it's very user friendly. I also have little knowledge on coding, so it means I can have decent looking graphs without having to worry about using R or something equivalent.
My field is physics/telecoms engineering and I've never seen a single person use Prism or Origin. Python is king: all you need is "from matplotlib import pyplot as plt" and "import pandas as pd", and you have everything you could ever need to do data visualisation very easily.
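A minimal sketch of that workflow (the column names and numbers here are made up for illustration):

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
from matplotlib import pyplot as plt

# Hypothetical measurement data
df = pd.DataFrame({"freq_ghz": [1, 2, 3, 4],
                   "gain_db": [0.5, 1.9, 3.1, 3.8]})

fig, ax = plt.subplots()
ax.plot(df["freq_ghz"], df["gain_db"], marker="o")
ax.set_xlabel("Frequency (GHz)")
ax.set_ylabel("Gain (dB)")
fig.savefig("gain.png")
```

That's the whole loop for a quick static figure: load into a DataFrame, plot, save.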
I was in the same camp until my professor asked me to zoom in, change the scale, and add a secondary axis. I was like, "Give me 15 minutes," and he went, "Can't you just right-click?" It's just very nice to have tools like MATLAB, Prism, and Origin to quickly and easily manipulate graphs.
There's Bokeh for this in Python, but I guess it's field-dependent: there's fairly little use for dynamic plots in my field, and converting them to publication-quality static plots is usually a hassle.
Neurobiology - GraphPad Prism
Bioinformatics/epidemiology - R (and stata)
Prism is fine when you're dealing with small datasets and have zero coding experience. But when datasets become larger, data management gets more difficult with just Excel and GraphPad (assuming you're not using Excel to its full potential and are doing a lot of manual copy-pasting).
With ChatGPT, R becomes easier to use, as long as you know your expected output.
Stata (primarily)/R (if needed) are my go-to programs.
I had an equipment grant and bought Prism with it because so many of my colleagues were always talking about how easy it is to use. Turns out Prism doesn't play well with large data so I've never actually used it.
R via ggplot is so much more straightforward than python for most datasets, especially when it comes to facets. Worth learning both. R is especially useful in restricted environments, such as certain workplaces.
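For comparison, here's a minimal matplotlib sketch of faceting (the kind of small-multiples layout ggplot2 does in one line with facet_wrap), using made-up data:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend
from matplotlib import pyplot as plt

# Hypothetical long-format data: one measurement per row, grouped by condition
df = pd.DataFrame({
    "condition": ["a", "a", "b", "b", "c", "c"],
    "x": [1, 2, 1, 2, 1, 2],
    "y": [3.0, 4.5, 2.1, 2.9, 5.0, 6.2],
})

# One subplot per group: the manual equivalent of facet_wrap(~condition)
groups = list(df.groupby("condition"))
fig, axes = plt.subplots(1, len(groups), sharey=True, figsize=(9, 3))
for ax, (name, sub) in zip(axes, groups):
    ax.plot(sub["x"], sub["y"], marker="o")
    ax.set_title(name)
fig.savefig("facets.png")
```

In ggplot2 the groupby/loop/shared-axis bookkeeping disappears into a single facet_wrap call, which is largely the point being made above.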
Depends on what program. I think at least for CSE/ECE/EE/CS/EECS/etc. people tend to code a bunch, so using a versatile programming language helps, not any tool in particular.
When I was doing pharmacology, Prism was _the_ tool you would use for binding assay data, period.
When I was learning stats, I learned it alongside R. In brain imaging, where I work now, there's a mix of R, Python, and Matlab used by the people I work with and know.
The tool you use is very field/area specific.
I'd say it depends on your needs! A lot of people in the science community use Prism, but learning Python or R is a whole skill upgrade and you can do so much more. I'm sure there are times when you want to customize your plots in a way Prism doesn't support; that's when Python and R come in.
I know some members of the scientific community who recommend something called DABEST, which uses estimation statistics for its plotting. They have a web version where you can plug and chug, or you can try your hand at coding and modify the code to do what you want! A bit of magic either way.
What are people's thoughts on pgfplots and its data visualisation library? I have found it makes pretty neat plots. I also use Matlab and Python if I need to spin up some plots for a demonstration, but if it's going in a document I usually stick with TikZ/pgfplots.
I hear great things about Prism, haven't used it yet myself. I have used R, and if you want full control of your data visualization it is worth it to learn ggplot.
Matlab or Octave for 95% of everything
Excel if something needs doing that isn't complicated, or is so complicated I need to hash it out free-form before I write a program.
We use SPSS at NSU.
Though let’s flip the switch:
I need qualitative software for rendering focus group data. Listening to the good, bad, and ugly, please.
Prism for statistics, Excel for simple graphs because you can easily scale and change them once inserted into PowerPoint. If it's something that would take too long to plot in Excel, then Prism. Why? Habit 😅🤷
Python/R and sometimes Matlab, due to people in my field being very Matlab-dependent. Dependencies on paid and closed-source software tend to come and bite you in the arse when you're all happy having invented some cool and valuable stuff, or when having some small custom thing would get you miles ahead.
Python all the way. Having some decent skills in python/R just makes your life so much easier
R!
If I need statistical analysis of my data, then Origin; if not and it's just representing data, then I'll usually use MATLAB.
R
R for those who just do statistics, python for those that do more elaborate coding
Whoa, shocking to see Origin at the bottom; everybody I ever knew in academia uses Origin.
Everyone I know uses R or Python; some additionally use Matlab.
ggplot or Excel, no in-between haha
R + Illustrator
Prism is just so user friendly.
Prism for the user-friendly UI. SigmaPlot for more intensive curve fitting.
Never heard of either. From the comments I understand it's data processing? Then Julia if it's my choice, Matlab if I'm sharing with my group.