Automatic code analysis with Bistr

If you need to analyze the source code of a project, want to automate the process, and want to keep everything on your own machine, the Bistr utility can be a great solution. In this article, we will look at how it helps analyze code using LLM models served locally by Ollama.

What is Bistr?

Bistr is a source code analysis utility that integrates a local LLM (large language model) served through Ollama to analyze and process code. With Bistr, you can analyze files in various programming languages such as Python, C, Java, JavaScript, HTML, and more.

Bistr uses the model to check files against a specific query, such as a question about the functionality of the code or a part of it. The result is a structured analysis that helps with developing, testing, and maintaining projects.

How does Bistr work?

  • Load state: When you start an analysis, the utility checks whether an analysis state has been saved previously. This lets you continue where you left off without re-analyzing the same files.
  • Code analysis: Each file is analyzed using the model served by Ollama. The tool sends the model a request to analyze a specific piece of code, and the model returns a relevance score for that code with respect to the query, along with a textual explanation of why the piece is relevant (a simplified sketch of this loop is shown after the list).
  • State preservation: After each file is analyzed, the state is updated so that the next run continues with up-to-date information.
  • Results output: All analysis results can be exported to an HTML file containing a table that ranks files by relevance, which helps you see which parts of the code matter most for further analysis.
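
To make the workflow concrete, here is a minimal sketch of such a loop in Python. This is not Bistr's actual implementation: the state-file name, prompt wording, and function names are invented for illustration, and the only real interface assumed is a local Ollama install serving its HTTP API on the default port 11434.

# Hypothetical sketch of a Bistr-style analysis loop (not the real implementation).
# Assumes Ollama is running locally and serving its HTTP API on the default port 11434.
import json
import requests
from pathlib import Path

STATE_FILE = Path("analysis_state.json")  # invented file name, for illustration only

def load_state() -> dict:
    """Resume from a previous run if a saved state exists."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())
    return {"done": {}}

def save_state(state: dict) -> None:
    """Persist progress after each file so the run can be continued later."""
    STATE_FILE.write_text(json.dumps(state, indent=2))

def analyze_file(path: Path, research: str, model: str = "llama3.1:latest") -> str:
    """Ask the local model how relevant this file is to the research question."""
    prompt = (
        f"Question: {research}\n\n"
        f"Rate the relevance of this code to the question and explain why:\n\n{path.read_text()}"
    )
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    return response.json()["response"]

def run(source_dir: str, research: str) -> None:
    state = load_state()
    # Only .py files here to keep the sketch short; the real tool handles many languages.
    for path in Path(source_dir).rglob("*.py"):
        if str(path) in state["done"]:
            continue  # already analyzed in a previous run
        state["done"][str(path)] = analyze_file(path, research)
        save_state(state)

if __name__ == "__main__":
    run("/path/to/code", "What is the purpose of this function?")

The real utility additionally builds the relevance ranking and renders the HTML report; the sketch only illustrates the load, analyze, and save cycle described above.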

Installation and launch

To use Bistr, you need to install and run Ollama, a platform that provides LLM models, on your local machine. The Ollama installation instructions for macOS, Windows, and Linux are described below.

Download the latest version of Bistr from GitHub:
https://github.com/demensdeum/Bistr/

After installing Ollama and Bistr, you can start the code analysis. To do this, prepare the source code and specify the path to the directory containing the files to be analyzed. The utility can continue the analysis from where you left off and can export the results to an HTML file for convenient review.

Example command to run the analysis:


python bistr.py /path/to/code --model llama3.1:latest --output-html result.html --research "What is the purpose of this function?"

In this command:

  • --model specifies the model to use for the analysis.
  • --output-html specifies the path where the analysis results are saved as an HTML file.
  • --research lets you ask the question you want answered by analyzing the code.
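
For example, a second run could target a different question and report file; this hypothetical invocation uses only the options shown above:

python bistr.py /path/to/code --model llama3.1:latest --output-html validation.html --research "Where is user input validated?"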

Benefits of using Bistr

  • Local execution: Analysis is performed on your computer without the need to connect to cloud services, which speeds up the process.
  • Flexibility: You can analyze code in different programming languages.
  • Automation: All code analysis work is automated, which saves time and effort, especially when working with large projects.

Local neural networks using ollama

If you would like to run something like ChatGPT and you have a powerful enough computer, for example one with an Nvidia RTX video card, you can use the ollama project, which lets you run one of the ready-made LLM models on your local machine, completely free. ollama provides a ChatGPT-like way of talking to LLM models, and the latest version also announced support for reading images and for formatting output as JSON.

I have also run the project on a MacBook with an Apple M2 processor, and I know that the latest AMD video cards are supported as well.

To install on macOS, visit the ollama website:
https://ollama.com/download/mac

Click “Download for macOS” to download an archive named something like ollama-darwin.zip. Inside the archive is Ollama.app, which you need to copy to “Applications”. After that, launch Ollama.app; the installation process will most likely run on first launch. You will then see the ollama icon in the menu bar, at the top right next to the clock.

After that, open a regular macOS terminal and type a command to download, install, and launch any ollama model. The list of available models, their descriptions, and their characteristics can be found on the ollama website:
https://ollama.com/search

If the model does not fit into your video card's memory at startup, choose a model with fewer parameters.

For example, the command to launch the llama3.1:latest model:


ollama run llama3.1:latest
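
Besides the interactive chat, ollama also exposes a local HTTP API (on port 11434 by default), which is how other programs, such as the Bistr utility described above, can integrate with it. A minimal sketch in Python, assuming a default local install with llama3.1:latest already pulled; the prompt here is just an example:

# Minimal sketch: query a locally running ollama over its HTTP API and ask for JSON output.
# Assumes ollama is serving on the default port 11434 and llama3.1:latest is installed.
import json
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1:latest",
        "prompt": "List three uses of a local LLM. Reply as JSON with a key 'uses'.",
        "format": "json",   # ask the model to return valid JSON
        "stream": False,    # return one complete response instead of a token stream
    },
    timeout=300,
)
answer = json.loads(response.json()["response"])  # the model's JSON reply as a Python dict
print(answer)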

Installation for Windows and Linux is broadly similar: on Windows there is an ollama installer, and you then work with it through PowerShell.
On Linux, installation is done with a script, although I recommend using the package from your distribution's package manager if one is available. On Linux, ollama can likewise be run from a regular bash terminal.
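
At the time of writing, the Linux script install is a single command; check ollama.com for the current version before piping it into your shell:

curl -fsSL https://ollama.com/install.sh | sh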

Sources
https://www.youtube.com/watch?v=Wjrdr0NU4Sk
https://ollama.com

Unreal Engine on MacBook M2

If you have managed to run the Unreal Engine 5 Editor on a MacBook with an Apple processor, you may have noticed that it lags quite a bit.

To improve the performance of the editor and engine, set Engine Scalability Settings -> Medium. After that, the engine will render everything a little less beautifully, but you will be able to work with it normally on your MacBook.

Fixing the mobile menu in WordPress


// Fix for the Seedlet theme: also initialize the navigation menus on DOMContentLoaded,
// in addition to the theme's default 'load' subscription.
document.addEventListener('DOMContentLoaded', function() {
    new navMenu('primary');
    new navMenu('woo');
});

If the mobile (iOS/Android) menu on your WordPress blog has not opened for years while using the Seedlet theme, simply add the snippet above inside the closure in the wp-content/themes/seedlet/assets/js/primary-navigation.js file, next to the default window addEventListener('load') subscription.

Radio-Maximum-Electron

Radio Maximum Electron is a powerful and convenient application for listening to the “Radio Maximum” radio station on a computer running Windows, Linux, or macOS. The player combines ease of use with solid functionality, giving you access to the live stream with minimal effort.

Just download the app from GitHub:

https://github.com/demensdeum/Radio-Maximum-Electron/releases

The author has no affiliation with Radio Maximum; he just really likes this radio station.
The main functionality is implemented by the Nativefier project:

https://github.com/nativefier/nativefier

The build scripts are licensed under MIT; the runtime has its own license!

Polar Bear’s Underwater Adventure

A simple game with infinitely generated mazes, built with Three.js.

Created as part of a 3-day game jam “Start the Game” on the theme “Family Game”.

A polar bear cub was walking on the ice with his mother when suddenly disaster struck – the ice cracked and he fell into the icy waters of the ocean. His mother was unable to save him, and the bear found himself in a mysterious underwater cave. To his surprise, he discovered that he could breathe underwater. There was only one way to escape from this trap – by overcoming the sea depths, solving riddles and fighting aggressive sharks, which could be fought off with well-aimed apple throws.

Now his goal is to find a way out of this underwater trap and return to his mother, overcoming the dangers of the sea depths and solving riddles.

https://demensdeum.com/demos/arctica/

Nixy Player

Nixy Player – A small, extensible, cross-platform JavaScript runtime.

  • Cross-platform: available on Windows, macOS, and Linux, as well as any other platform with C++ and dynamic library support.
  • Lightweight: minimal resource consumption with efficient performance.
  • Extensible: designed to be easily extended with plugins and additional libraries.

Please visit the Releases page to stay up to date with the latest releases and updates:
https://github.com/demensdeum/NixyPlayer/releases/

Raiden Video Ripper

Raiden Video Ripper is an open source project for video editing and format conversion. It is built using Qt 6 (Qt Creator) and allows you to trim and convert videos to MP4, GIF, and WebM formats. You can also extract audio from videos and convert it to MP3 format.
Screenshot: the RaidenVideoRipper interface. Still from COSTA RICA IN 4K 60fps HDR (ULTRA HD):
https://www.youtube.com/watch?v=LXb3EKWsInQ
Please visit the Releases page to stay up to date with the latest releases and updates:
https://github.com/demensdeum/RaidenVideoRipper/releases