I feel that being made aware of alternatives, and of the ways that changing different parts of the process can yield different results, is an important part of the academic side of learning photography, and it helps in forming a mental roadmap of where to go on the technical side of things.
However, I personally would strongly encourage starting with a consistent, proven baseline of a traditional "manufacturer's suggested" workflow, and sticking with that for a period of time.
If you aren't aware of the factors involved at each step toward producing a final image, it becomes far harder to make educated guesses when "something doesn't work", or when something seems a bit off from what you were expecting. Sticking firmly to a single start-to-end process with minimal variables lets you build up a set baseline to judge things against. Once you can reliably bring finished photos up to that baseline of 'the standard', you can start stretching your wings and poking at things to experiment and fine-tune your work.
On the other hand, you could always throw the entire "care and precision" mindset to the wind and keep rolling the dice to see what comes up. Jump every which way you can and see what happens. Could be trash, could be epic, could be something you never manage to do again in your lifetime. There will be plenty who look down their noses at you for doing it, but in the grand scheme of things there isn't much reason to care about others' opinions of how you choose to work. It is art, not life-or-death surgery. (Assuming you're playing nice with health, safety, and environmental regs. Please don't grab random chemistry and throw it together willy-nilly to 'see what happens'... That road leads to danger, lost eyebrows, or worse.)