Julia Evans

especially interested in what helped folks specifically get over the fear of accidentally deleting their files / doing something destructive on the command line. so far folks have mentioned:

- get started on a raspberry pi / other “safe” environment
- use a fancy prompt that tells you the current directory

(again, only looking for replies from folks who got over this fear relatively recently)

66 comments
kf

@b0rk what if I’ve been doing infrastructure for 10+ years and still haven’t gotten over this fear? asking for a friend 😂💞

kf

@microwavenby I will say that in 10 years, I have yet to accidentally delete something, so depending on your perspective: either the fear is working or it is unwarranted 😂

Adam Williamson :fedora:

@kf
I did! And now I'll never see my emails (and misc documents and things) from 1992 through 2005 ever again...

So, still got that fear, hah. Look, "sda" and "sdb" are *very similar*
@microwavenby

rf

@kf @b0rk uh, not unrelated, we have servers where the prompt has the server's name and role (e.g. "test", "prod replica"), and some potentially dangerous ops go through wrappers that only let you run them on boxes without a prod role (and sometimes check that other things make sense)
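
For illustration, a hypothetical sketch of what such a wrapper might look like - the hostname-based "prod" check and the wrapped command name are assumptions, not necessarily how these servers actually do it:

#!/bin/sh
# refuse to run the real command on anything that looks like a prod box
case "$(hostname)" in
  *prod*) echo "refusing to run on a prod box: $(hostname)" >&2; exit 1 ;;
esac
exec /usr/local/bin/real-dangerous-command "$@"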

Matt

@b0rk put EVERYTHING in git, so I can just revert/reset --hard HEAD to undo whatever mess I just created
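
Roughly, the workflow being described - a minimal sketch, assuming the directory is already under git:

git add -A && git commit -m "checkpoint"   # commit before experimenting
# ...make a mess...
git reset --hard HEAD    # throw away uncommitted changes to tracked files
git clean -nd            # untracked files need git clean; -n previews before deleting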

Julia Evans

@angermcs oh cool idea! what do you mean by EVERYTHING? (like do you put all your dotfiles in git? how does that work?)

Adam Williamson :fedora:

@b0rk
I've seen some people who actually put the repo on github. I don't know exactly how you go about conveniently applying the files to the right location on a system, though...maybe there's some tooling for it?
@angermcs

Justin Browne

@adamw @b0rk gnu stow is a good tool to use for this. You can create a repository for just the stuff you want to backup, and restore it all with a `stow -t ~ git vim bash rust ...`
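
For anyone unfamiliar with stow: each top-level directory in the repo is a "package" whose contents get symlinked into the target directory. A rough sketch, with hypothetical package contents matching the command above:

# dotfiles/
#   git/.gitconfig
#   vim/.vimrc
#   bash/.bashrc
cd ~/dotfiles
stow -t ~ git vim bash   # creates ~/.gitconfig -> dotfiles/git/.gitconfig, and so on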

⚡️

@adamw @b0rk @angermcs i do this - dot files on git

i use it to sync .zshrc, .ssh/config, .gitconfig, .tmux.conf, Brewfile, and a few other app configs

it's one of the reasons i feel more comfortable experimenting with changes on my shell and git configurations lately

having the files publicly on github is just for sharing with others; i like to see what others are aliasing or writing functions for

⚡️

@adamw @b0rk @angermcs to apply the files i do the following

1. git clone --bare <repo url> .dotfiles
2. alias dotfiles='git --git-dir=$HOME/.dotfiles/ --work-tree=$HOME'
3. dotfiles checkout
4. source ~/.zshrc

1. creates a bare clone
2. creates an alias for working with your dotfiles repository - good idea to also put this same alias in .zshrc
3. actually checks out the files from the repo - if you have issues here you may need to stash before checking out
4. sources the shell configuration
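
Day-to-day use with that alias then looks like ordinary git - a small sketch; the showUntrackedFiles setting is a common optional addition, not part of the steps above:

dotfiles config --local status.showUntrackedFiles no   # optional: stop "status" listing everything in $HOME
dotfiles status
dotfiles add ~/.zshrc
dotfiles commit -m "update zshrc"
dotfiles push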

Nicholas Weaver

@b0rk @angermcs
ln -s .{dotfile} /archive/nweaver/dotfiles/{dotfile}

wizzwizz4

@angermcs @b0rk I have been doing this for a couple of years, and it has only just occurred to me that this doesn't protect me from deleting the .git directory.

Nicholas Weaver

@b0rk
I know this isn't recent (been doing command line for decades), but I still have
alias rm='rm -i'
in my shell setup.

groxx

@b0rk I tried saferm and I still think that should be the default everywhere...

... but I've more come to terms with the paranoia. Care is fundamentally necessary, being paranoid about it is *correct*. Most tools are well-behaved by default, but some are designed to let you shoot your foot if that's what you want.

groxx

@b0rk I assume no, because it doesn't exist in everything I connect to, but yeah - I pretty reliably forget about it for years at a time, but I do make sure to add it whenever I start up something new. I've got a check for it in some of my reused dotfiles setup scripts, for example.

groxx

@b0rk it has saved my bacon once, and I managed to ctrl-c early enough on another machine's `sudo rm -rf ~` to not lose anything important. Hence: paranoia is healthy.

It really should be the default. It's much too easy to accidentally do that in scripts with bad string handling, and the few legitimate uses are so rare and so special that "run `/bin/real_rm` by hand" is entirely reasonable.

groxx

@b0rk otherwise, my general tactic, and what I recommend to lots of people who want to learn computers: get a burner laptop for like $100. Learn how to wipe it and start over. Then go wild. If there's no valuable data on it, it doesn't really matter what you do with it.

E.g. this is how I test backup software, because I don't want to hand my "real" data over until I think I can trust one.

People need playgrounds. Their sole machine with their entire life's worth of data is not a playground.

Julia Evans

@groxx did you use this playground approach yourself? did it help? what kinds of stuff did you do on your playground computer?

groxx

@b0rk backups and restore testing is a very big one - lots of backup software is easy to set up but awful to recover with. By that point they've already hooked you, they have little incentive to do much but take your rent money until you have a problem, because nobody wipes their main PC to test it.

I use it for new OS testing and sometimes trying things out for friends - resizing partitions is a crapshoot much of the time, and a problematic install can hose the whole system, so yeah. Paranoia.

groxx

@b0rk sometimes it also serves as a trivial air-gap to use potentially-shady software - just transfer stuff on a USB and erase when done. If it works, yay. If not, the risk of harm is practically zero.

groxx

@b0rk and as to using it myself to learn: yep! It was by far my biggest and easiest "ok now it makes sense" stepping stone, and I still use it as such.

Plus, like, ever wondered what `sudo rm -rf /` actually does, how much it breaks, and how you can sometimes recover from it? I've done it a few times, it's interesting.

groxx

@b0rk there's always the "just run a virtual machine" option, but that's often more of a pain imo.
If I need a lot of test environments for something esoteric, sure (snapshots are great), but it's often quite far from being able to tell you "does this work with my hardware" or "what is the experience of this like".

Virtual machines are also definitely not something I can reasonably recommend to complete newbies. "Follow Microsoft's recovery instructions" is easy, translating to a vm is not.

Jose Galaviz

@b0rk As a developer, committing constantly. I should never lose more than a couple days of work. As a user, backing up constantly. It no longer matters if I make a mistake on the CLI or not.

Yuki 膤 :heart_trans:​

@b0rk here's my tricks:

- use tab completion, if it doesn't tab complete (or if it does and it shouldn't because the file isn't supposed to exist yet) there's something wrong
- avoid using wildcards and relative paths, or make them as specific as possible
- test on backups first!
- avoid doing anything destructive at all, and use a GUI whenever possible, I don't want to discourage use of CLI but hey, if you can :)

Michael Dekker

My teenager is learning to use command line interfaces on the Minecraft server they operate. And they have used the command line to accidentally delete a huge amount of personal work.

I think the sandbox nature of the environment and the low but non-trivial stakes for errors make for a great mechanism for safe learning of the tool—especially learning to make sure they know what a command does before using it, and to never rush the process of executing untested commands.

kate

@b0rk as someone who recently (last week) accidentally deleted my home dir due to a cmd line typo, the thing that's helped me is having everything in an automatically backed up, easily restorable form. nuke the os and start again, restore to a restore point, or something else super easy.

CCC Freiburg

@b0rk can't remember the tool name right now - but basically it forces you to enter the name of the machine if you do something like reboot, shutdown, etc.
it is relatively popular among some of us.
@3rz ideas?

RyanSquared

@b0rk having a fancy prompt is SO important! I like to know what system I'm on, what permissions I have (user, whether I'm superuser, etc), the area I'm working in, and especially Git branch - that's the only one I don't explicitly mention when I enter a workdir.
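
As a rough bash-flavoured sketch of that kind of prompt - user, host, working directory, and git branch (zsh and fish have their own syntax, and plenty of prompt frameworks do this for you):

# in ~/.bashrc
git_branch() { git branch --show-current 2>/dev/null; }
PS1='\u@\h:\w ($(git_branch))\$ '   # e.g. user@host:~/src (main)$  - \$ becomes # when you're root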

Andrew Sterian

@b0rk daily automated backups. I like Arq.

Julia Evans

@steriana i just started using Arq and I love it, I somehow never figured out how to back up my system in 20 years of using linux on the desktop

ericjmorey

@b0rk

learngitbranching.js.org/
Was very helpful.

So was reading the first 7 chapters of git-scm.com/book/en/v2

I only read the first 3 chapters at first and then slowly read up to 7 intermittently.

duckalini

@b0rk I really struggle to read man docs, they just… don’t have enough examples to make sense to me. So anything that gives me another way of finding the info I need without a massive textbook was good. I have printed copies of all your zines, and I use ohshitgit all the time. Seeing example use cases described in words, and then shown via example commands helped me understand tremendously.

duckalini

@b0rk and when I find commands that really help me, i write my own explanation and copies of the commands. I used to Slack DM myself, but then I lose the history, so now I keep a Notes doc full of “work hacks”. It’s mostly git stuff, some ways to poke around and explore postgres and Rails databases, and I think a handful of awscli and terraform state commands.

duckalini

@b0rk not sure if my experiences count as new enough, I’ve been in infrastructure ~8-10 years depending on how you count. I still use these things daily tho, so 🤷‍♀️

Janne Moren

@b0rk
Up to date backups, and - as another person mentioned - keep everything you can in git.

It's easy to be fearless when you always have an undo button.

Edit: Besides, if you are seriously worried about accidentally deleting files, you should be just as worried about a hardware failure or software bug doing it for you. It probably means you don't have backups or don't trust them. That's the first thing you need to fix.

Paul_IPv6

@b0rk

my prompts still have both hostname and current dir, after i rebooted the wrong host twice, a gazillion years back.

some fears never quite go away but new folks should also know that the command line is still a bit intimidating even to those of us who have used it for years. a good mix of confidence and caution will serve you well at any stage in your learning curve.

Richard Johnson

@paul_ipv6 @b0rk

I've been doing this 40+ years. Still not over the fear. `alias rm='echo "Use the GUI, Luke"'` helps a bit. I can `unalias rm` after some thought if I really need to live.

Rachel Rawlings

@b0rk I've never quite gotten over the fear, despite being a 40-year command line veteran and the kind of person who looks back over her shoulder asking "What?" to the complaining, cowering angels.

For a while I had a piece of label tape on my monitor that read "Read twice, Enter once."

But nightly automatic backups are always your friend.

Julia Evans

so many great answers in this thread, appreciate you all

Paul_IPv6

@b0rk

thanks for all the blogs/threads you start.

great info for folks old and new. getting back to solid grounding in fundamentals and understanding what lies beneath is always a good thing.

Mark Wolfe

@b0rk Containers have been really helpful for both learning and protecting people from themselves. I personally use them as a sandbox for testing commands and tools before running them in an environment. This is something I also encourage others to do as it pays to try things out!
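
A minimal sketch of that kind of throwaway sandbox, assuming Docker; the image and the mount are just examples:

# everything inside the container is gone when you exit (--rm)
docker run --rm -it ubuntu:24.04 bash
# or mount a copy of some data read-only to poke at it safely
docker run --rm -it -v "$PWD/copy-of-data":/data:ro ubuntu:24.04 bash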

${jndi:blit32 💻

@b0rk @wolfeidau three things:

1- Good backups
2- Move to trash/tmp instead of deleting
3- Still have some fear

CleoQc aka Nicole 🦜🌈🧶🐍🍁 :mstdn:

@b0rk
To be honest, I never got over this fear unless I'm on a Pi, where I know I can re-image the SD card. But on a server? I still freak out when I have to copy files. I'm double or triple checking every potentially destructive line, then explaining it to a rubber ducky, before hitting enter.

Julia Evans

@CleoQc not clear where the line is between reasonable caution and paranoia (we all need a little paranoia on the command line!)

CleoQc aka Nicole 🦜🌈🧶🐍🍁 :mstdn:

@b0rk
I'm definitely paranoid when I'm on the production servers. Doesn't matter how many years of experience. That dread feeling isn't going away.

Russell Davis

@CleoQc Just do it. YOLO!

I bet you read manuals too

Kyle Worthington

@b0rk A prompt that tells me the current working directory and autocomplete (such as fish), redundant & robust continuous backups and just gradual practice with using the command line to do weird things over time. Simply knowing the working directory (and regularly listing its contents) does wonders for confidence. I’ve occasionally asked someone who is more skillful than I am to do sanity checks on my commands when they became complicated. Went from just not using CLI to almost living in it.

RSNikhil

@b0rk Place `alias rm="rm -i"` in .bash_aliases so `rm` prompts for confirmation. Can explicitly override with -f.
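
Concretely - with GNU rm, a later -f takes precedence over the aliased -i:

# in ~/.bash_aliases
alias rm='rm -i'

# then at the shell:
rm old.log      # runs "rm -i old.log" - asks before deleting
rm -f old.log   # runs "rm -i -f old.log" - the later -f wins, no prompt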

Karl

@b0rk Using said safe environment to actually do said destructive things.

As for file destruction, I have successfully carved data back from the bare bytes of a drive whose index I fried beyond recoverability. It wasn't fun but knowing I could get (some) stuff back was a relief.

Chobbes

@b0rk is “almost nothing is that important” a good answer? I’ve talked to some people who have blown up homework assignments and I usually just tell them “it’s not actually as bad as you think. You didn’t actually lose all your work because now you know how to do it and you can write it again much faster and even better now”… which is maybe cold comfort at first but people usually tell me I was right after they fix it!

The other thing is just having more familiarity with the tools so you know what’s dangerous and what’s not… And when something seems sketchy you know to make a snapshot / backup beforehand. Better yet, you already have backups of everything important so one command on one computer can’t ruin everything. Everything I really care about is in git repos that are replicated already, so I’m never *too* worried.

Kim van Wyk

@b0rk when I have to remote in to do some potentially destructive work on a Really Important server I set my terminal background colour to the angriest red I can find. It doesn't fully solve the fear but it does remind me to be even more careful than usual.

TheDbof :verified:

@b0rk since no one seems to have mentioned it before:

I aliased "rm" to "trash-put" from the "trash-cli" Linux package, which moves deleted files into a "trash can" (hidden directory in home dir) where all or individual files can be restored.

Did this after wrongly deleting files too often, and it significantly reduced my own anxiety when using rm.

man.archlinux.org/man/trash-pu
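
The rest of the trash-cli toolbox is handy too - a quick sketch:

alias rm='trash-put'   # "deleted" files land in the trash instead
trash-list             # list trashed files with their original paths and deletion dates
trash-restore          # interactively put a file back where it came from
trash-empty 30         # permanently remove items trashed more than 30 days ago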
