I've worked in a few companies, about 10-100 people each, often cooperating with other teams, sometimes in another city or country. My roles were programmer-analyst and programmer; I never worked as a consultant.
If a company had customers, it could survive. The role names below may not be accurate, and depending on company size, one person could fill several roles at once. The Main Architect (or Technical Leader) was responsible for the whole project from a technological viewpoint, but still participated in meetings with clients.
The process looked like this:
First there were talks between the clients' representatives and the analysts. Documents (requirements and more) were produced; it took a while.
These documents were passed to analysts and designers, who translated the clients' requirements into technical language and abstract technical designs (in my opinion, these should be contracts and automated tests).
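To make that opinion concrete, here is a minimal sketch (the requirement and function names are hypothetical, not from any project mentioned above) of how a client requirement such as "an invoice total is the sum of its line items, and line items are never negative" could be written down as a contract plus an automated test instead of living only in a design document:

```python
def invoice_total(line_items):
    """Contract: every amount must be non-negative; the result is their sum."""
    for amount in line_items:
        if amount < 0:
            raise ValueError("line item amounts must be non-negative")
    return sum(line_items)

def test_invoice_total():
    """The requirement itself, expressed as executable checks."""
    assert invoice_total([10.0, 2.5]) == 12.5
    assert invoice_total([]) == 0
    try:
        invoice_total([-1.0])
        assert False, "expected a contract violation"
    except ValueError:
        pass  # the contract caught the invalid input, as required

if __name__ == "__main__":
    test_invoice_total()
    print("all requirements hold")
```

Unlike a prose design document, such a test can be re-run after every change, so drift between the design and the code shows up immediately.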
Programmers, each understanding software analysis to some extent, could communicate well with the analysts. They did the implementation. The system was developed on developer machines, each programmer solving a different part of the project with similar (though not always identical) tools.
Periodically the code was committed to a repository (where everyone with access could store it or retrieve it in a safe and reliable way), not without clashes and problems.
Then the code either went into integration (often the programmers did the job of joining the parts themselves), or it was passed to the integrator: someone responsible for making sure the parts of the whole system fit together (car analogy: tires, windows, different spoiler variants, and so on).
Then came tests, feedback, and bugfixing.
Each phase lasted a predetermined amount of time.
After successful tests, the system was moved from the testing or integration environment to the 'production environment', where it was considered 'live' and users could use it.
Then a new iteration (cycle) began, and problems accumulated. No one wants coders to always start from scratch, but reality changes. Adjusting code is much harder than writing it from scratch: there are problems with understanding each other's code, especially when it is full of errors and tangled in dependencies of kinds many programmers don't even have a clue about. If code has errors, it is usually best to isolate them before fixing them, one at a time. Automated tests help a lot here, because there are ways to ensure that code which breaks them is never allowed into the repository (or the responsible person can be identified and the previous version restored), and this saves everyone stress and cost.
In the new iteration (development cycle, in this case), analysts went back to the customers, gathered feedback (and perhaps payment), and change requests were debated. New words entered the project dictionary, new technical designs were drawn up, and parts of the system had to be rebuilt.
The more reusable and higher-quality the code, the easier it was to manage change and erosion, which is the cause of the eventual fall of many projects.
The rest of the cycle repeated, and programmers came under time pressure, growing software complexity, staff changes, and the different ways of thinking of new employees; at some point, not many understood what problems they were solving, or how not to get fired for looking stupid.