I just joined a new team (very small: four developers in total, with two of those leaving soon). The two original developers set up the git repo in a folder on a Windows network share.
Am I taking crazy pills, or is that a bad idea? Our organization does have github/gitlab/bitbucket available, so is there any good reason not to use those hosted solutions?
Still a better version control system than
20210131_v0.2_test2_john
20210131_v0.2_test2_john_final_final2_final_real
Are you concerned about corruption due to multiple users? As long as you’re using the repo in the intended way, it’s fine. Git has its own locking mechanisms. Pull, work, commit, push.
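For what it’s worth, the day-to-day flow looks the same as with any other remote. A minimal sketch, assuming a hypothetical bare repo at //fileserver/dev/project.git (Git for Windows accepts UNC paths written with forward slashes):

```
git clone //fileserver/dev/project.git   # hypothetical UNC path
cd project
# ...edit files locally...
git add -A
git commit -m "describe the change"
git pull --rebase   # pick up anyone else's pushes first
git push            # git takes its own locks on the shared side
```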
I can’t exactly put my finger on it, but something feels off. For example, on my first day, I wasn’t able to view the files in Windows Explorer (or clone the repo, actually), so the other dev just gave me a zip file of the repo. There’s something fishy going on, and I’m trying to figure it out.
Since it’s on a network share, there’s the extra overhead of managing the file system permissions. And you probably hadn’t received access at that point.
That probably is the case, but in my mind I’m also questioning whether they’re backing it up regularly, what prevents someone from going in and deleting the files, etc.
Sure, let’s hope they have a backup policy in place as a matter of best practice. But it’s all kinda decentralized anyway: every dev is going to have their local repo, and that is essentially a backup.
Do you mean a bare repo that you use as a remote and push/pull from, or using the working directory straight from the share? The former would be strange but kinda usable (don’t do this though); the latter is in the “oh my god, get another job” category.
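To make the distinction concrete, here’s a sketch of the “strange but kinda usable” variant, again with a made-up share path:

```
# One-time setup: a bare repo on the share acts purely as the remote.
git init --bare //fileserver/dev/project.git

# Each dev clones and works locally; only push/pull ever touch the share.
git clone //fileserver/dev/project.git

# The "get another job" variant is pointing your editor at
# \\fileserver\dev\project directly and running git against the share itself.
```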
Working from the network share - I’ve worked on a project like this before, and it was awful for developer experience. It took seconds just to run a git status, it was so slow. Occasionally git would lock itself out of being able to update files, since the network read/write times were so slow. Large merges were impossible. The reason it was set up like this was that the CEO had the network share we were working off of set up to serve to a subdomain (so, like, Bob’s work would be at bob.example.com), and it would update live, as most hot reloads do. He wanted to be able to spy on developers and see what they were doing at any given time.
I have a lot of programming horror stories from that job. The title of this thread brought up a lot of repressed memories.
Yes, it’s definitely the former case, thankfully. Agreed that it’s strange, but it’s hard to come up with a concrete technical argument if I decide to push for hosting it somewhere better.
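If it helps the argument: the migration itself is cheap, since any clone already carries the full history. A rough sketch, with a made-up hosted URL:

```
# Mirror everything (all branches and tags) from the share...
git clone --mirror //fileserver/dev/project.git
cd project.git
# ...into a freshly created empty project on the hosted service.
git push --mirror git@gitlab.example.com:team/project.git

# Afterwards each dev repoints their existing clone:
git remote set-url origin git@gitlab.example.com:team/project.git
```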
To be honest, I’d start by asking them why it’s set up like that, as diplomatically as possible. This might be a bad solution, but whatever pushed them to adopt it nevertheless might be an organizational peculiarity you don’t want to discover the hard way.
I don’t think this is too bad, but the question here is why they set it up this way. Are there any restrictions, like no SSH? Also, this would make it hard to clone from an off-site location (for remote work).
My team had issues when IT accidentally changed permissions on the files inside a bare git repo located on a file share. Otherwise it works okay, since people clone and work locally. Not the best solution, but we’re working around restrictions that make this the easiest thing to do.
Do you git clone from the Windows share, or do you all just use the same share as the working tree?
The big difference there seems to be cloud, i.e. someone else’s computer, versus the company-internal network. That is a big and fundamental choice: will you allow (leak) your data outside of your control?
Doing anything with Windows shares is a bad idea technically, of course. But with git, every workspace is generally a full copy of the repository. All you need a shared one for is syncing work. And backups are always a sensible thing to have no matter how you arrange files. With that, it seems like a low-risk thing to me.
A bundle might or might not be safer than a repo, but either will probably work.
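If anyone wants to try the bundle route: it’s a single self-contained file you can drop on the share or into a backup folder. A minimal sketch:

```
# Pack every ref and its full history into one file...
git bundle create project.bundle --all
git bundle verify project.bundle   # sanity-check that it is complete
# ...and restore a full clone straight from that file later.
git clone project.bundle project-restored
```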
Reading some of the comment responses, it does sound like there’s an air of ineptitude, or a long history of systems not working and devs not being allowed to make them work, so they just try workarounds for everything.
Our organization does have github/gitlab/bitbucket available
Do you mean “cloud services”? Maybe your colleagues don’t want them there.
For PCI-DSS relevant code, we only use internal systems.
I don’t see how this would be compliant with literally anything.
I would have to agree with this; it seems rather odd for a confidential or classified code repo to be shared on a Windows share. The reason we would use (self-hosted) Git services is that they come with a multitude of security services/layers maintained by a dedicated team of system administrators: firewalls, service updates, data redundancy, backups, Active Directory, and so forth.
I can see a scenario where people accidentally put classified repos, or information that isn’t supposed to be shared, on a Windows share where unauthorized users could view them.