Holy shit, this is incredible. I have wanted a way to permanently hide Shorts forever, thanks for sharing. Also, it’s actually recommended by Mozilla, which means it gets active security review. Impressive.
*Anecdote.
I doubt that was intentional; they would likely want to hide that latency, but the CPU time required to scan everything just is what it is.
https://bsky.app/profile/filippo.abyssdomain.expert/post/3kowjkx2njy2b
The hooked RSA_public_decrypt verifies a signature over the server’s host key with a fixed Ed448 key, and then passes the payload to system().
It’s RCE, not an auth bypass, and it’s gated/unreplayable.
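To make the “gated/unreplayable” part concrete, here’s a minimal runnable sketch of that pattern in Python (my own illustration built on the cryptography library, not the backdoor’s actual code):

```python
# Runnable sketch of the gating pattern (an illustration, not the
# backdoor's actual code). In the real thing the Ed448 public key is
# hardcoded and only the attacker holds the private half; here we
# generate both so the demo runs.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed448 import Ed448PrivateKey

attacker_priv = Ed448PrivateKey.generate()
attacker_pub = attacker_priv.public_key()

host_key = b"this-server's-host-public-key"
command = b"id > /tmp/pwned"

# Signing host_key || command binds the payload to one server: a
# captured payload can't be replayed elsewhere, and nobody without
# the attacker's private key can mint a valid one.
signature = attacker_priv.sign(host_key + command)

def gate(signature: bytes, host_key: bytes, command: bytes) -> bool:
    try:
        attacker_pub.verify(signature, host_key + command)
    except InvalidSignature:
        return False  # backdoor falls through to normal authentication
    # The real backdoor hands the command to system() at this point.
    return True

print(gate(signature, host_key, command))            # True
print(gate(signature, b"some-other-host", command))  # False
```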
Ohh, that makes way more sense, thanks. I haven’t used Debian in like 10 years, but it was presumably the same back then too.
The slowness is on purpose? To help the attacker identify which nodes are running the compromised sshd? What other reason(s) could there be?
They could be more like AMD in that regard, to answer your question:
- Direct contributions to the Linux kernel: AMD contributes directly to the Linux kernel, providing open-source drivers like amdgpu, which supports a wide range of AMD graphics cards.
- Mesa 3D Graphics Library: AMD supports the Mesa project, which implements open-source graphics drivers for AMD GPUs, improving performance and compatibility with the OpenGL and Vulkan APIs.
- AMDVLK and RADV Vulkan drivers: AMD has released AMDVLK, its official open-source Vulkan driver. In addition, there’s RADV, an independent Mesa-based Vulkan driver for AMD GPUs.
- Firmware: AMD publishes redistributable firmware blobs for its GPUs through the linux-firmware repository, enabling out-of-the-box support in the Linux kernel (the blobs themselves are not open source, though).
- ROCm (Radeon Open Compute): an open-source platform providing GPU support for compute-oriented tasks, including machine learning and high-performance computing, on AMD GPUs.
- AMDGPU-PRO driver: while primarily a proprietary driver, AMDGPU-PRO is built on top of the open-source amdgpu kernel driver, offering compatibility and performance for professional and gaming use.
- X.Org driver (xf86-video-amdgpu): an open-source X.Org driver for AMD graphics cards, providing 2D acceleration, video acceleration, and display support.
- GPUOpen: a collection of tools, libraries, and SDKs, many of them open source, that help game developers and other professionals get the most out of AMD GPUs.
Am I the only one in this thread who uses VSCode + GDB together? The inspection panes and the ability to set breakpoints and hover over variables to drill down into them are just great. Everyone should set up their own c_cpp_properties.json and tasks.json files and give it a try.
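For anyone who wants to try it, a minimal launch.json for the C/C++ extension to hook GDB up looks something like this (the program path and the preLaunchTask name are placeholders for your own project; pair it with a build task in tasks.json):

```jsonc
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug with GDB",
      "type": "cppdbg",
      "request": "launch",
      // Placeholder: point this at your own binary.
      "program": "${workspaceFolder}/build/myapp",
      "args": [],
      "cwd": "${workspaceFolder}",
      "MIMode": "gdb",
      // Placeholder: must match a task label in tasks.json.
      "preLaunchTask": "build",
      "setupCommands": [
        {
          "description": "Enable pretty-printing for gdb",
          "text": "-enable-pretty-printing",
          "ignoreFailures": true
        }
      ]
    }
  ]
}
```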
I’m betting the truth is somewhere in between. Models are only as good as their training data, so if over time they prune out the bad samples to increase overall quality and accuracy, that should vastly improve every model in theory. But the sheer size of the datasets they’re using now is 1 trillion+ tokens for the larger models. Microsoft (ugh, I know) is experimenting with the “Phi-2” model, which uses significantly less training data but focuses primarily on the quality of the dataset itself, letting a 2.7B-parameter model compete with 7B-parameter models.
https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/
On complex benchmarks Phi-2 matches or outperforms models up to 25x larger, thanks to new innovations in model scaling and training data curation.
This is likely where these models are heading: pruning out superfluous and outright incorrect training data.
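As a toy illustration of what that kind of curation pass looks like (the heuristics here are made up for the example; the real Phi-2 pipeline uses far more sophisticated, largely model-based filtering):

```python
# Toy sketch of training-data curation: score each sample with cheap
# quality heuristics and keep only those above a threshold. These
# heuristics are placeholders, not Microsoft's actual pipeline.
def quality_score(text: str) -> float:
    words = text.split()
    if len(words) < 20:  # too short to carry much signal
        return 0.0
    return len(set(words)) / len(words)  # penalize heavy repetition

def curate(dataset: list[str], threshold: float = 0.5) -> list[str]:
    return [t for t in dataset if quality_score(t) >= threshold]

samples = [
    "spam spam spam spam spam " * 10,
    "A longer passage explaining how garbage collection reclaims "
    "memory by tracing object reachability from a set of roots, "
    "then freeing everything unreachable.",
]
print(len(curate(samples)))  # 1 -- the repetitive sample is dropped
```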
Doesn’t that suppress valid information and truth about the world, though? For what benefit? To hide the truth, to appease advertisers? Surely an AI model will come out some day as the sum of human knowledge without all the guard rails. There are already some good ones, like Mistral 7B (and Dolphin-Mistral in particular) that are uncensored. I hope Mistral and the other AI developers keep maintaining lines of uncensored, unbiased models as these technologies grow even further.
While that is true, a lot of death and suffering was required for us to reach this point as a species. Machines don’t need the wars and natural selection required to achieve the same feats, and don’t have our same limitations.
I finally took the plunge to the Linux desktop for all work in 2016 and have not looked back (aside from the occasional Windows VM, extremely rare now). Even Arch is now perfectly fine as a workstation, which surprised me. I recommend EndeavourOS to streamline the install process, but it’s Arch underneath.
Is it possible for you to rephrase that comment? Don’t quite understand what you are getting at.
https://github.com/jdhao/nvim-config#features
Highly recommend this.
A modern Neovim configuration with full battery for Python, Lua, C++, Markdown, LaTeX, and more…
This is enough to get intellisense and the linters up and running. It only takes ~5 minutes to configure once the prerequisites are installed, and it’s worth it.
You mean the whole licensing ordeal? The retroactive-fee type crap? I know a few developers personally who dumped it entirely because of that. I heard they backpedaled a little on that part because of the backlash, but the damage is done; trust is gone.
FF has way too much groundwork laid and way too much mindshare currently (especially given the Rust language and all…). If, for some reason, thousands of devs just gave up on Mozilla, others would most likely fork it and continue the path.
I could see myself implementing that via API calls into the app to write my own git repo out of the data. Not sure if Joplin or any of these apps have APIs, but I would hope so.
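Joplin does have one, for the record: the Web Clipper service exposes a REST data API on localhost. A rough sketch of dumping notes into a git repo might look like this (the port, endpoints, and field names are from my reading of its docs; double-check them):

```python
# Sketch: export Joplin notes to files and commit them to a git repo.
# Assumes Joplin's Web Clipper service is enabled (data API, default
# port 41184) and TOKEN holds the API token from Joplin's settings.
import pathlib
import subprocess
import requests

TOKEN = "your-api-token"  # placeholder: copy from Web Clipper options
BASE = "http://localhost:41184"

repo = pathlib.Path("joplin-backup")
repo.mkdir(exist_ok=True)

page = 1
while True:
    resp = requests.get(
        f"{BASE}/notes",
        params={"token": TOKEN, "fields": "id,title,body", "page": page},
    )
    data = resp.json()
    for note in data["items"]:
        # One markdown file per note, named by note id to stay unique.
        (repo / f"{note['id']}.md").write_text(
            f"# {note['title']}\n\n{note['body']}", encoding="utf-8"
        )
    if not data.get("has_more"):
        break
    page += 1

subprocess.run(["git", "-C", str(repo), "init"], check=True)
subprocess.run(["git", "-C", str(repo), "add", "-A"], check=True)
subprocess.run(["git", "-C", str(repo), "commit", "-m", "Joplin snapshot"], check=True)
```

Run it on a cron job or systemd timer and you’d have a versioned history of your notes for free.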
I mostly ditched them many years ago because of privacy concerns (or lack thereof.) Around when I stopped using Dropbox too (same reason.)
Ahh, it’s been so long since I tried any Nintendo emu. I just bought a new wireless gamepad; I should really try yuzu soon.
I generally agree. Lately I’ve taken the plunge and compiled everything from source (on Linux). While tricky for some packages (dependencies, mostly), the outcome is unusually stable. More stable than I expected.
I don’t really have to fix anything in Linux. I do a lot of advanced things, though (I’m a software dev): manually changing executables’ paths, swapping them out with symlinks, using custom newer GCC builds, etc. But even with all of that I still rarely ever have to “fix” anything. I have been waiting, prepared, for when this Ubuntu install craps out so I can finally wipe it and switch to Arch on this PC… but it just keeps going and going without a hiccup.
I’m not sure what people are referring to that they have to fix all the time, but obviously no two people have the same experience, and there are so many variations of a Linux system. Take 10 different desktop environments or window managers, plus different pieces of software or hardware, and every permutation is going to have either more problems or fewer.
Ultimately I would recommend anybody just give all of the distros and DE/WMs a try. A good try: give each a few weeks and see how it feels. You’re not going to know what you’ve been missing, or whether anything has bugs or quirks at all, until you do.