Terminal Editor:
I really prefer 'micro': https://micro-editor.github.io/
Graphical editors:
For a text editor, I really don't think anything beats Sublime Text.
It's also nice to have an editor that works the same on any OS.
Visual Studio Code also works wonders -- remote editing is glorious.
Wasn't Fish shell running on Rust or one of these fancy-fancy new programming languages? :-)
On Saturday, February 6th Arelor muttered...
Wasn't Fish shell running on Rust or one of these fancy-fancy new programming languages? :-)
It's written in C++
In response to some people delving into Linux and talk of various tools/editors/etc., here is some of the tools and setup I always use on Linux boxes (and in some cases BSD, OS X, and even Windows boxes):
Modern replacements:
* ripgrep (rg): VERY fast 'grep' replacement with sane defaults
https://github.com/BurntSushi/ripgrep
* fd: Fast and modern 'find' replacement
https://github.com/sharkdp/fd
* bat: Colorized 'cat' with syntax highlighting
https://github.com/sharkdp/bat
* dfc: df replacement
https://github.com/Rolinh/dfc
* procs: ps alternative
https://github.com/dalance/procs
* exa: ls replacement for various circumstances
https://the.exa.website/
Other tools:
* fzf - fuzzy finder
https://github.com/junegunn/fzf
* jq - JSON query
https://stedolan.github.io/jq/
* broot - Directory browser/etc.
https://dystroy.org/broot/
* Glances - a very nice process monitor
https://nicolargo.github.io/glances/
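For anyone wanting to kick the tires, here's a rough sketch of typical invocations (assuming the tools are installed from the links above), plus a portable trick: wrap the modern tool in a function that falls back to the classic one, so the same script still works on a fresh install.

```shell
#!/bin/sh
# Typical invocations (assuming the tools above are installed):
#   rg TODO              # recursive search, respects .gitignore
#   fd '\.log$'          # find files by regex, skips hidden dirs
#   bat ~/.bashrc        # cat with syntax highlighting
#   vim "$(fzf)"         # fuzzy-pick a file, open it in vim

# A portable middle ground: prefer ripgrep when present, fall back
# to plain grep so the script also runs where rg isn't installed.
search() {
    if command -v rg >/dev/null 2>&1; then
        rg -n "$1" .
    else
        grep -rn "$1" .
    fi
}
```

Both branches print matches as file:line:text, so callers don't care which one actually ran.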
Great list. I use a few neat-o things, too:
bat - cat replacement @ github.com/sharkdp/bat
lsd - ls replacement @ github.com/Peltoche/lsd
procs - ps replacement @ github.com/dalance/procs
zenith - top replacement / htop? @ github.com/bvaisvil/zenith
hyperfine - command line benchmark tool @ github.com/sharkdp/hyperfine
bpytop - htop replacement @ github.com/aristocratos/bpytop
bandwhich - network bandwidth tool @ github.com/imsnif/bandwhich
zoxide - cd replacement, install info @ github.com/ajeetdsouza/zoxide
hexyl - freaking sick hex viewer @ github.com/sharkdp/hexyl
Nice! Many of these are on my list (including some I didn't post) as
well, but I haven't used bpytop yet. I'll have to check it out and
compare to Glances.
Bpytop is my favorite, but I hadn't heard of Glances... oh, the things you learn... :P Let us know what you thought about bpy.
I think someone is also watching the DistroTube channel on YouTube :) He had a video about alternative/modern Linux tools a couple of days ago.
One downside of using such tools is that they are not installed by default on newly installed Linux distros. Even if you install them, you should still know the default tools well, in case you have to operate a fresh install.
I've not seen this - have a link?
and also every additional tool you add is an extra attack vector.
I've been around *nix systems since the 80s and somebody is always adding a new rewrite of the tools because of some supposed missing feature.
Imagine where, for example, OpenOffice (or whichever variant you
choose) would be if the developers were not split up across several different competing projects - OpenOffice, LibreOffice, Abiword, ....).
This is how you get stale and stop innovation. Fragmentation has its
pros and cons, but look at Linux as a whole: it won. It really did.
Linux is fragmented and everywhere and moving forward fast. Other
systems are also in the competition too -- Windows, OS X, BSDs, ... --
and we don't really need or want a single winner.
On 02-19-21 22:39, hal wrote to xqtr <=-
and also every additional tool you add is an extra attack vector.
I've been around *nix systems since the 80s and somebody is always
adding a new rewrite of the tools because of some supposed missing feature.
9 times out of 10 the feature is covered by using '|' to pipe output into another tool, each tool staying fast and quick.
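For example, a "most common words" report needs no new tool at all; the standard pieces compose:

```shell
#!/bin/sh
# Classic pipe composition: each small tool does one job and '|' glues
# them together. Report the top 3 most frequent words on stdin:
top_words() {
    tr -cs '[:alpha:]' '\n' |  # split input into one word per line
        sort |                 # group identical words together
        uniq -c |              # count each group
        sort -rn |             # most frequent first
        head -3                # keep the top three
}
```

Every stage here is POSIX, so it runs the same on a fresh install of any distro or BSD.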
On 02-19-21 23:27, NuSkooler wrote to hal <=-
jq as another example has become a fairly standard tool. JSON is
extremely common so a standard tool dealing with it has emerged.
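A couple of typical one-liners, for the curious (assuming jq is installed):

```shell
#!/bin/sh
# Pull a single field out of a JSON object (-r strips the quotes):
echo '{"name":"jq","type":"filter"}' | jq -r '.name'
# -> jq

# Keep only matching elements of an array (-c for compact output):
echo '[{"id":1,"ok":true},{"id":2,"ok":false}]' | jq -c '[.[] | select(.ok) | .id]'
# -> [1]
```

Fittingly, jq filters themselves compose with '|' inside the expression, mirroring the shell pipeline idea.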
hal wrote to xqtr <=-
I've been around *nix systems since the 80s and somebody is always
adding a new rewrite of the tools because of some supposed missing feature.
9 times out of 10 the feature is covered by using '|' to pipe output into another tool, each tool staying fast and quick.
When the GNU project really got underway there were many new developers who joined the GNU / Linux community who (1) didn't appreciate this and (2) decided that we must code every conceivable feature.
This resulted in a number of things that have hampered the open source community:
(1) we ended up with bloated GNU tools that are a lot slower than the original tools
(2) tools that are a lot more complex because of the dozens of
additional (mostly unneeded) features which makes it harder for people
to learn and results in more people wrongly using tools or creating
their own tools that do a limited subset
(3) a plethora of tools that all do essentially the same thing
(4) a massive waste of developer time spent each year that could be
spent to advance the tools we have or dream up tools that we currently don't have. Imagine where, for example, OpenOffice (or whichever
variant you choose) would be if the developers were not split up across several different competing projects - OpenOffice, LibreOffice,
Abiword, ...). If the developers learned how to work together better
then one of these projects would have buried Microsoft years ago. The
same goes for many other tools in the open source community.
(5) the user base would be more educated ... rather than trying out
dozens of different tools you learn one tool deeply, then another,
and the whole user experience is deepened.
(6) maybe even our existing tools might be better/deeper/broader documented
(7) systems would be more coherent from the install and also as has
been mentioned more consistent as there is nothing worse than not
having the same toolset on every machine. When I started in the Unix
and Linux world Emacs had to be compiled from source to install which
took time. Emacs was great it did loads but having to install on EVERY machine was a chore. Keeping Emacs' configuration in sync was a chore.
And migrating customizations/scripts/dotfiles/... to every machine is enough trouble without adding tons of one off utilities --- I maintain
a scripts library for building machines, useful scripts for
administration etc. Learning how to combine tools and writing your own scripts proves to be far more beneficial in the long run than swapping
out tools at every chance you have.
I'm not saying that you shouldn't use these tools but my advice is to learn the standard tools more deeply first.
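In that spirit, much of what the fd crowd reaches for is already in plain POSIX find once you learn it; a quick sketch:

```shell
#!/bin/sh
# Plain find covering common "fd" use cases:

# 1. All .sh files, skipping any .git directory entirely:
find . -type d -name .git -prune -o -type f -name '*.sh' -print

# 2. Files modified within the last 7 days:
find . -type f -mtime -7

# 3. Run a command on each match (like fd -x), e.g. line counts:
find . -type f -name '*.log' -exec wc -l -- {} +
```

The -prune idiom is the part most people never learn, and it's exactly the ".gitignore awareness" they then install a new tool for.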
NuSkooler wrote to hal <=-
This is how you get stale and stop innovation. Fragmentation has its
pros and cons, but look at Linux as a whole: it won. It really did.
Linux is fragmented and everywhere and moving forward fast. Other
systems are also in the competition too -- Windows, OS X, BSDs, ... --
and we don't really need or want a single winner.
I personally hate the argument that open source developers need to do this and that. Focus on one thing. Stop fragmenting and doing your own thing. If someone wants to tell an open source developer what he or she should be doing, then they should pay them.
There are plenty of developers working on open source software who are paid to do so, and I am sure they do what they are paid to do, but those who aren't can do whatever they want with their spare time.
It's actually kinda really rude.
That's one thing I've always liked about Linux and similar environments. It's a pity people are forgetting the heritage.
I find jq useful for scripting. This is one where a new tool to suit current environments makes sense.
I'm not saying that you shouldn't use these tools but my advice is to learn the standard tools more deeply first.
FWIW there are a lot of OSS developers NOT getting paid to develop
and contribute to software that is sold as well. From Linux distros
and the kernel up to userspace apps, OS X, half the stuff AWS and
friends run on, etc., are MOSTLY written by people in their spare
time. It's a hobby in a way but also a second job in another.
Have I told you today how much I appreciate that you do shit like
build your own BSD distro :)
hal wrote to poindexter FORTRAN <=-
I still don't regret learning vim in the 80s as everything is familiar
and capable. I still like the idea of Emacs but I am way more effective
in vi or vim and the muscle memory is still there despite not being employed as a *nix system admin since 2001.
hal wrote to poindexter FORTRAN <=-
When I got my first Raspberry Pi I played around with Plan 9 and found
the simplicity intoxicating.
I'm not against picking your favorite tools ... but it seems these days it is where nearly everybody goes first, whereas learning the core tools first (even the expanded set in the main distributions) will benefit people more.
Yep exactly, although I think at least some of those companies contribute back in other ways: Apple's contributions to WebKit and CUPS, for example.
Farmers who have a passion for farming don't give away their vegetables for free... but although open source developers have a choice, if they do try to monetize their work, you can bet another alternative will pop up because someone is pissed off that they had the audacity to try and earn something.
Lol. Thanks. It's totally pointless, but if you're interested I have it up at quinnbsd.org. I've made a bit of a mess in base as I'm learning as I go, but it works ok :)
IIRC Apple has been paying a single developer (mostly) for CUPS for
years. But yeah, companies certainly do contribute back; still, "the
house always wins" as they say, and they are coming out on top with a
lot of free work. When the work goes back to the community this isn't
too bad; when it's slurped up and commercialized it kind of sucks
(though within their rights with e.g. a BSD license).
I'm interested in so much that I think it's awesome (and may try an
image at some point), but I have so many projects my head is spinning.
NuSkooler wrote to hal <=-
It gives me a bit of a chuckle to see you guys talk about learning the "basic" and "core" tools, then go on to talk about your favorite
editors and the like.
I'm curious what you think the core tools are?
Dmxrob wrote to Xqtr <=-
2. We are finding that cloud spend is killing budgets. From large
organizations we are seeing some SMALL movement to put a few things
back in the data centers, and I think containerization is going to be
huge here. Many organizations simply cannot afford to go crazy in the
cloud. Everything costs. When a business starts spending more on IT
than they do on R&D for their core products, something is very wrong.
3. Businesses are suffering from re-inventing the wheel, and from not
knowing what is going on. A person leaves, and all of a sudden a
critical system doesn't work because he/she was the only person who
knew about it. We spent 6 months of human resource capital to develop
a new system in WhizBang v2.0 and it brings exactly $0 of extra
revenue to the company, etc. What's more, WhizBang v2.0 requires 4 IT
resources to maintain it, whereas WhizBang v1.0 only required 2.
I still don't regret learning vim in the 80s as everything is familiar and capable.
Could never get a handle on vi/vim... already had too many ingrained
WordStar-esque commands in the ol' grey matter... It was like talking
to an alien..
*** Quoting Spectre from a message to hal ***
Could never get a handle on vi/vim... already had too many ingrained
WordStar-esque commands in the ol' grey matter... It was like talking
to an alien..
I was recently on a support call doing an upgrade on some of our linux VM appliances used for backups. I had to log into them
and modify some configuration files and vi was the only thing available.
At that point I just gave control to the guy on the other end of the phone and said "go nuts". He chuckled and said that happens a lot.
Jay
... The cause of problems are solutions!
I was recently on a support call doing an upgrade on some of our linux
VM appliances used for backups. I had to log into them and modify some configuration files and vi was the only thing available.
At that point I just gave control to the guy on the other end of the
phone and said "go nuts". He chuckled and said that happens a lot.
Basic Vim is easy to grasp :-) If you run into a heavily modded and
Ahh no, it might be i for insert, but how the hell do you finish?
Esc to get out of insert mode, then ZZ (or :wq) to save and quit.
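To close the loop for the WordStar crowd, the minimal vi survival kit (standard vi, no plugins or mods assumed):

```
i        enter insert mode
Esc      back to command mode
:w       write (save)
:q!      quit without saving
ZZ       write and quit (same as :x or :wq)
u        undo
```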