All posts by Technical Unicorn

Starting with LangChain

Once again, I am jumping into the deep end of the pool to learn a new technology. LangChain is a library that empowers developers to build applications using large language models (LLMs) and other computational resources. It makes it easy to manage interactions with language models, chain together multiple components, and integrate additional resources, such as APIs and databases.

Here is a brief walkthrough of how LangChain works:

  1. Components and Chains: LangChain has several core concepts. Components are the basic building blocks of LangChain. They are responsible for processing input data and generating output data. Chains are sequences of components that are executed in order.
  2. Models: LangChain provides a generic interface to various foundation models. These models can be used for tasks such as question-answering, chatbots, and agents.
  3. Prompts: LangChain provides a framework to help you manage your prompts. Prompts are the input data that is fed into your LangChain application.
  4. Memory: LangChain provides a standard interface for memory, a collection of memory implementations, and examples of chains/agents that use memory.
  5. Evaluation: Generative models are notoriously hard to evaluate with traditional metrics. One new way of evaluating them is using language models to do the evaluation.

The LangChain package can be loaded quickly using my favorite tool, Azure Data Studio's Manage Packages feature.

Python should already be installed at this point.
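
To make the concepts above concrete, here is a minimal sketch of a one-component chain. It is written against the 0.0.x-era API referenced in the tutorials below and assumes the langchain and openai packages are installed and an OPENAI_API_KEY environment variable is set; newer releases may organize the imports differently.

# A minimal sketch of prompt + model + chain, assuming the LangChain 0.0.x API,
# the openai package, and an OPENAI_API_KEY environment variable.
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Prompt: the input template fed into the application.
prompt = PromptTemplate(
    input_variables=["topic"],
    template="Explain {topic} in one short paragraph.",
)

# Model: a generic interface to a foundation model.
llm = OpenAI(temperature=0)

# Chain: a sequence of components executed in order.
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("dynamic management views in SQL Server"))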

I’ve selected five great sites for tutorials about Langchain:

  1. GitHub – gkamradt/langchain-tutorials
  2. Getting Started — LangChain 0.0.187.
  3. Getting Started with LangChain: A Beginner’s Guide to Building LLM
  4. Tutorials — LangChain 0.0.187.
  5. LangChain: How to Use – Beginner’s Guide | Step-by-Step Tutorial.

LangChain is a game-changer for building LLM-powered applications, offering developers an intuitive and efficient framework. With its composable components and chains, LangChain empowers programmers to write clean, concise, and scalable code.

As LangChain continues to evolve, we can expect it to find its place among the leading tools for LLM development, attracting developers with its power and ease of use. My journey begins here; how about you?

What Happens if You Get Hit by a Bus, Technically Speaking?

Why is the question of personal well-being and mental health always directed toward those in the technology field? What if a different kind of accident occurs? Individuals working in technology often face this inquiry from management and auditors who seek to assess departmental risks. Comprehensive plans are devised, explicitly detailing measures to restore systems and services in case of their absence. All these efforts are aimed at safeguarding and fortifying the business.

However, should we not also prioritize the well-being of the bus driver in our own career journeys? Who exactly is this bus driver? The bus driver can represent anyone who jeopardizes your career, job, or role. The threat emerges unexpectedly and catches you off guard, and the consequences can be severe and widespread. These individuals can be found within your professional network. Are you equipped with the knowledge to evade them before they cause harm? I will share the invaluable lessons I've gathered while traversing the landscape of technology.

Discover a fresh perspective on wellness and mental health tailored for individuals in the technology field.

WSL can be your go-to environment

You won't always have access to the internet. YES, it's true. Circumstances (outages) and bad Wi-Fi often leave us grounded from the cloud. But there is a way to stay creative and keep building when you are offline: WSL (Windows Subsystem for Linux).

WSL provides developers and users with a seamless environment for running Linux distributions directly on their Windows machines. Whether you're a developer, a system administrator, or an enthusiast, trying WSL can unlock a world of possibilities. Let's consider some of the benefits:

  1. Embrace the Best of Both Worlds:
    WSL enables you to harness the strengths of both Windows and Linux, creating a synergy that can significantly enhance your workflow. By bringing the Linux environment to Windows, you can enjoy the familiarity and usability of Windows while leveraging the power and versatility of Linux tools, utilities, and package managers. Whether you’re a fan of the command-line interface or need to run specific Linux applications, WSL lets you seamlessly switch between the two worlds, all within the comfort of your Windows environment.
  2. Streamlined Development Environment:
    For developers, WSL offers a streamlined and cohesive development experience. You can now build, test, and deploy your applications in a Linux environment without a separate machine or virtualization. WSL supports a wide range of Linux distributions, including Ubuntu, Debian, Fedora, and more, providing access to a vast ecosystem of development tools, programming languages, and frameworks. Whether you’re working on web development, data analysis, or machine learning, WSL empowers you to optimize your workflow by leveraging the power of Linux within Windows.
  3. Enhanced Compatibility and Interoperability:
    One of the most significant advantages of WSL is its ability to enhance compatibility and interoperability between Windows and Linux environments. With WSL, you can seamlessly access and modify files on Windows and Linux, eliminating the need for complex workarounds or file conversion tools. WSL also allows you to execute Windows executables directly from the Linux command line and vice versa, opening up new possibilities for cross-platform development and collaboration (a short sketch of the file interop follows this list).
  4. Easy Setup and Configuration:
    Setting up WSL is a breeze, making it accessible to beginners and advanced users. With a few simple steps, you can enable WSL on your Windows machine and choose your preferred Linux distribution from the Microsoft Store. The installation process takes care of all the necessary components, including the Linux kernel, and within minutes, you’ll have a fully functional Linux environment at your fingertips. Additionally, WSL provides a command-line interface for managing your Linux installations and updating them effortlessly.
  5. Community and Support:
    WSL has gained widespread popularity and a vibrant community of users and developers. This means you’ll find abundant resources, tutorials, and support for issues or questions. Whether you’re looking for specific Linux software recommendations or troubleshooting advice, the WSL community is always ready to assist you. Moreover, Microsoft maintains and updates WSL, ensuring a reliable and stable platform for your Linux-based endeavors.
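
As a quick illustration of the file interoperability mentioned in point 3, here is a sketch of Linux-side Python browsing the Windows file system through WSL's /mnt/c mount; the user name in the path is a placeholder for your own.

# A minimal sketch of WSL file interoperability: Linux-side Python reading the
# Windows C: drive, which WSL mounts at /mnt/c. The user name is a placeholder.
from pathlib import Path

windows_docs = Path("/mnt/c/Users/your-user-name/Documents")

# List a few Windows files and folders from inside the Linux environment.
for item in sorted(windows_docs.iterdir())[:5]:
    print(item.name)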


Windows Subsystem for Linux (WSL) is a game-changer, bringing the power and flexibility of Linux to the Windows ecosystem. With its easy setup, broad compatibility, and strong community support, WSL is a must-try for developers, system administrators, and enthusiasts looking to harness the best of both the Windows and Linux worlds.

https://learn.microsoft.com/en-us/windows/wsl/install

AWS Lightsail: My Go-To Cloud Server

I only needed to see this demo once, and I was hooked. The setup was seamless, quick, and accessible. The price point was just what I needed, so I wouldn't get shocked with a BIG bill. The marketplace gives you just enough selections that you aren't overwhelmed by options. All signs pointed to yes, and I have used it ever since. I always find it beneficial to highlight my reasons before I sign up for a service, so these are the things that helped me. I am not a hardcore programmer, but I want to spend more time on the development than the setup.

One of the standout advantages of AWS Lightsail is its simplicity in both setup and management. With just a few clicks, you can launch a virtual server, select your desired configuration (including the operating system, CPU, RAM, and storage), and have your application up and running quickly. The user-friendly interface and intuitive management console make it easy to monitor your instances, manage networking settings, and access essential resources, even for those new to cloud infrastructure. AWS Lightsail abstracts much of the complexity, allowing you to focus on your application rather than the underlying infrastructure.
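
For readers who prefer scripting that launch instead of clicking through the console, here is a rough sketch using boto3, the AWS SDK for Python. The instance name, zone, blueprint, and bundle values are illustrative only; list the current choices with get_blueprints() and get_bundles() before running anything.

# A rough sketch of launching a Lightsail instance with boto3 rather than the
# console. The name, zone, blueprint, and bundle values below are placeholders.
import boto3

lightsail = boto3.client("lightsail", region_name="us-east-1")

response = lightsail.create_instances(
    instanceNames=["my-first-lightsail-box"],  # placeholder instance name
    availabilityZone="us-east-1a",
    blueprintId="ubuntu_22_04",                # OS image (example value)
    bundleId="nano_2_0",                       # smallest plan (example value)
)
print(response["operations"][0]["status"])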

Other benefits to consider:

  1. Familiarity: If you're already using AWS, then Lightsail will be familiar.
  2. Simplified UI: The user interface is greatly simplified, making it easy to use for beginners. YES
  3. Speed: You can deploy a VPS within seconds, choosing any of the predefined plans Amazon offers.
  4. Scalability: Lightsail is scalable and can grow with your business.
  5. Affordability: Lightsail is affordable and offers a range of pricing plans to suit your needs.
  6. Ease of use: Lightsail is easy to use and requires no prior knowledge of AWS.
  7. Security: Lightsail is secure and comes with built-in security features.
  8. Flexibility: Lightsail is flexible and can be used for various applications.
  9. Reliability: Lightsail is reliable and comes with a 99.9% uptime guarantee.
  10. Support: Lightsail comes with AWS support, so you can get help when needed.

AWS Lightsail empowers web developers, businesses, and individuals to deploy and manage their applications with ease, scalability, and cost-effectiveness. With its simplified setup and management, seamless scalability, reliable performance, and integration with AWS services, Lightsail offers a comprehensive solution for your projects. The lowest-priced server starts at $3.50 per month; you can't beat that.

AWS Lightsail

Azure Data Studio – Works For Me


As a data enthusiast and professional, I am always looking for powerful tools that can simplify my data exploration and analysis tasks. I wanted to share my experience working with Azure Data Studio, a comprehensive data management and analytics tool. It has become an invaluable tool in my data and content writing journey.

  1. Intuitive User Interface:
    Azure Data Studio boasts a sleek and intuitive user interface, making navigating and performing complex data operations easy. When I launched the application, I was impressed by its clean design and well-organized layout. The intuitive interface allows me to manage connections, explore databases, write queries, and visualize data effortlessly. The well-thought-out user experience of Azure Data Studio significantly enhances my productivity and makes working with data a breeze.
  2. Multi-Platform Support:
    One of the standout features of Azure Data Studio is its multi-platform support. Azure Data Studio provides a consistent and seamless experience across different operating systems, whether you are a Windows, macOS, or Linux user. Cross-platform compatibility empowers users to work with their preferred operating system, regardless of their data management and analysis needs.
  3. Robust Querying Capabilities:
    Azure Data Studio provides robust querying capabilities, allowing me to extract valuable insights from my data. With built-in support for Transact-SQL (T-SQL), I can write complex queries, execute them against databases, and view the results in a structured manner. The IntelliSense feature provides intelligent code completion, making query writing more efficient and error-free. Additionally, the query editor supports advanced functionalities like code snippets, code formatting, and query execution plan visualization, enabling me to optimize my queries and enhance performance.
  4. Seamless Integration with Azure Services:
    Azure Data Studio seamlessly integrates with various Azure services, creating a unified data management and analytics experience. Whether I need to work with Azure SQL Database, Azure Data Lake Storage, or Azure Cosmos DB, Azure Data Studio provides built-in extensions and features that facilitate seamless integration with these services. This integration enables me to leverage the power of Azure’s cloud services directly from within the tool, simplifying data exploration, analysis, and collaboration.
  5. Coding and Development:
    The seamless integration of Python with Azure Data Studio allows me to leverage the power of Python libraries and frameworks for data analysis, machine learning, and visualization. The intuitive interface of Azure Data Studio, combined with the flexibility of Python, enables me to write and execute Python scripts effortlessly, making complex data tasks feel accessible and manageable. Whether performing data transformations, building predictive models, or creating interactive visualizations, the combination of Azure Data Studio and Python empowers me to explore and derive insights from my data collaboratively and efficiently (a short sketch follows this list).
  6. Extensibility and Community Support:
    Azure Data Studio is highly extensible, allowing users to enhance its functionality through extensions and customizations. The vibrant community surrounding Azure Data Studio has developed a wide range of extensions, providing additional features, integrations, and productivity enhancements. From query optimization tools to data visualization extensions, the community-driven ecosystem of Azure Data Studio expands its capabilities and caters to diverse data needs. The availability of community support and the collaborative nature of the tool make Azure Data Studio a vibrant and constantly evolving platform.
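
As a small example of the Python integration from point 5, here is the kind of notebook cell I run in Azure Data Studio: a T-SQL result pulled into pandas for quick exploration. It assumes a local SQL Server instance with the WideWorldImporters sample database and the pyodbc and pandas packages installed; adjust the connection string for your environment.

# A minimal sketch of mixing T-SQL and Python in an Azure Data Studio notebook.
# Assumes a local SQL Server, the WideWorldImporters sample database, and the
# pyodbc + pandas packages; adjust the connection string for your environment.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=WideWorldImporters;Trusted_Connection=yes;"
)

# Pull a small result set into a DataFrame for quick exploration.
df = pd.read_sql("SELECT TOP 10 OrderID, OrderDate FROM Sales.Orders;", conn)
print(df.head())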

Azure Data Studio has transformed my data exploration and analysis journey with its intuitive interface, multi-platform support, robust querying capabilities, seamless integration with Azure services, and vibrant community. Whether you are a data professional, developer, or enthusiast, Azure Data Studio offers a comprehensive and user-friendly environment to work with data efficiently and derive meaningful insights. My experience with Azure Data Studio has been exceptional, and I highly recommend it to anyone seeking a powerful tool for their data management and development endeavors.

For me, it’s all about the Data!

https://learn.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio

Python and Azure Data Studio

I am unsure what drew me into trying Python libraries like a shopper in a warehouse store, but I wanted to try them all, especially the ones that produce graphic displays. There is a little extra effort to do this in Ubuntu, but Azure Data Studio manages it seamlessly. Visualizations are the best way to tell a data story; I try to use them whenever possible. I like to keep it simple, so here is some sample Python code run in Azure Data Studio.

Prerequisites:

  • Python
  • Azure Data Studio
  • The matplotlib package added to Azure Data Studio (via Manage Packages)

Insert the following code into a Jupyter notebook or run it directly in Azure Data Studio:

import matplotlib.pyplot as plt

# Sample data: a small list of values to bin into a histogram.
x = [1, 1, 2, 3, 3, 5, 5, 7, 8, 9, 10,
     10, 11, 11, 13, 13, 15, 16, 17, 18, 18,
     18, 19, 20, 21, 21, 23, 24, 24, 25, 25,
     25, 25, 26, 26, 26, 27, 27, 27, 27, 27]

# Group the values into 10 bins and render the histogram.
plt.hist(x, bins=10)
plt.show()

These libraries are pre-written modules that provide ready-to-use functions, tools, and utilities, significantly enhancing the capabilities of Python. Here are some of the benefits of using Python libraries and how they can transform your coding experience.

  1. Accelerate Development: Python libraries eliminate the need to reinvent the wheel by providing pre-built solutions to everyday programming tasks. Whether you need to handle data manipulation, implement machine learning algorithms, or create visualizations, Python libraries such as NumPy, Pandas, and Matplotlib offer efficient and optimized functions that speed up development. Leveraging these libraries saves valuable time and enables you to focus on the core aspects of your project (a short sketch follows this list).
  2. Diverse Functionality: Python libraries cater to various domains and applications. From web development to data analysis and natural language processing to computer vision, a library is available for almost every use case. For example, TensorFlow and PyTorch provide powerful tools for machine learning and deep learning. The diverse functionality of Python libraries opens up endless possibilities for your coding projects.
  3. Code Reusability: Python libraries promote code reusability, a cornerstone of efficient programming. By utilizing libraries, you can leverage existing code and build upon the work of others. This saves time and encourages best practices and standardization across projects. Reusable code from libraries reduces errors, enhances reliability, and facilitates code maintenance. Python libraries act as a vast repository of solutions that can be easily integrated into your own projects, increasing productivity and reducing development effort.
  4. Foster Innovation: Python libraries provide a fertile ground for innovation and experimentation. These libraries often incorporate cutting-edge techniques and algorithms, empowering developers to explore new possibilities and push the boundaries of what can be achieved. You can unlock new insights and develop groundbreaking applications by harnessing the power of libraries like NLTK for natural language processing. The libraries encourage innovation and fuel the growth of the Python ecosystem.
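
To keep the first point concrete, here is a tiny sketch of the kind of work NumPy and pandas take off your plate; the sales figures are made up purely for illustration.

# A tiny sketch of how NumPy and pandas shortcut everyday data work.
# The sales figures below are made-up illustration data.
import numpy as np
import pandas as pd

sales = pd.DataFrame({
    "region": ["East", "West", "East", "West"],
    "units":  [120, 95, 140, 110],
})

# One line each for grouping, aggregating, and a basic statistic.
print(sales.groupby("region")["units"].sum())
print("overall mean:", np.mean(sales["units"]))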


Python libraries are invaluable for developers, offering many benefits that enhance productivity, promote code reuse, and drive innovation. The extensive range of libraries available enables developers to tackle diverse challenges efficiently and effectively. Whether a beginner or an experienced programmer, leveraging Python libraries empowers you to accomplish more with less effort, accelerating development and fostering creativity. Embrace the power of Python libraries and unlock the full potential of your coding journey.

Dynamic Views – The Storytellers of SQL Server

The successful management of data is an essential part of the role of an administrator. Although third-party tools can be helpful, SQL Server has some built-in functionality that can provide valuable information to most technical teams. As data changes over time, the data environment's performance will also change. Administrators and engineers rely on system-generated data within SQL Server and its management tools to resolve issues and monitor SQL Server's performance. This discussion examines those built-in resources and how they are used to manage the performance of SQL Server. Monitoring is one way to see changes over time, and visual graphs or charts can help spot trends or changes within the data.

Let's briefly consider a few of the built-in features. SQL Server Management Studio has an Activity Monitor, which provides insights into the activities and processes running within SQL Server. The Activity Monitor Overview pane displays the following data:

  • Processor Time
    • Shows how busy the CPUs are and whether they are operating at an acceptable level.
  • Waiting Tasks
    • The number of tasks that are waiting for processor, I/O, or memory resources to be released so the tasks can be processed.
  • Database I/O
    • The data transfer rate in MB/s from memory to disk, disk to memory, or disk to disk.
  • Batch Requests/sec
    • The number of T-SQL command batches received by the server per second.

Expanded View (activity is queried only when the panel is expanded)

  • Processes
    • Displays information about the active processes running on the SQL databases, including login ID, database, command, and application name.
  • Resource Waits
    • Shows where active requests are waiting on resources such as memory, disk, and network.
  • Data File I/O
    • Displays information about the database files (MDF, LDF, and NDF) in the SQL Server instance, including their paths and names.
  • Recent Expensive Queries
    • Shows expensive queries executed in the last 30 seconds.
  • Active Expensive Queries
    • Displays a list of currently running queries that utilize a large number of resources (memory, disk, and network).

Windows provides Performance Monitor (PerfMon), which can be used to add customized counters related to SQL Server.

The following types of counters are available for SQL Server:

    • Avg. Disk sec/Read
      • The average time, in seconds, of a read of data from a disk.
    • Avg. Disk sec/Write
      • The average time in seconds needed to write data to a disk.
    • Page Life Expectancy
      • The number of seconds a data file page is expected to remain in the buffer cache; it is not based on averages.
    • Buffer cache hit ratio
      • The percentage of page requests satisfied from the buffer pool without reading from disk; a low value does not always indicate a problem.
    • Compilations/sec
      • Measures the number of times SQL Server compiles an execution plan per second. 

SQL Profiler is the GUI for SQL Trace. It can trace the steps of queries and processes executed against SQL Server and analyze them. The results are saved in a file that can be replayed for a closer review, which helps reproduce an error first observed in another process.

All of these monitoring tools are useful but limited by the amount of data each can provide. In most cases, a closer look into the system data is needed. User-created data and objects pose the most risk to any system, and the original schema, concept, or design of the data can change over time. Improper or limited scaling, poorly written queries, data ingestion, and access patterns can all impact the database's performance. The discovery of an issue means the technologist must become a detective, working backward from the known results or impact to the unknown cause.

SQL Server is designed not only to store data and related objects but also to collect insights about its own running state. This information records the activities performed and executed on the instance since its last restart, and unless it is persisted to a table, it is cleared whenever SQL Server restarts. These insights are exposed through dynamic views.

Dynamic management views return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance.[i]  Database-based dynamic management views are the storytellers of the database. Each of these views can be used to tell the story of the data by diagnosing performance issues in SQL Server.

Understanding Dynamic Views

Dynamic views can play an essential role in troubleshooting bothersome performance issues, and engineers and administrators are often called upon to address those issues with them. The results from dynamic views can be customized to contain only the data that you need. Appropriate VIEW SERVER STATE or VIEW DATABASE STATE permission is needed in SQL Server to query these dynamic views.

There are two types of dynamic management views and functions:

  • Server-based dynamic management views and functions. These require VIEW SERVER STATE permission on the server.
  • Database-based dynamic management views and functions. These require VIEW DATABASE STATE permission on the database.

The key to utilizing these views is selecting the appropriate fields. Most views contain numerous columns; generally, only a few are needed to troubleshoot most problems.

The Problem Query

The complexity of a query can impact performance. A complex query goes beyond a basic SELECT statement against one table: it can involve inner and outer joins against multiple tables and combine sub-queries, nested commands, and AND, OR, and WHERE clauses.

The purpose of a query is to retrieve specific data from a single table or a set of joined tables. The data you want to see can be spread across several tables, and queries allow you to view it in a single result set. Performance becomes a challenge when the data starts to scale. The schema can change; tables are no longer used, field names change, or more attributes are added. As the data scales, the indexes once relied upon may no longer provide the same value. An upgrade or user-driven changes can trigger these problems.

Queries are not intentionally designed poorly. Over time, the usefulness of some queries wears out, and data often scales beyond the scope for which the query was originally written. Query plans define the order and method of execution, and each can be impacted if the underlying data or data objects change. Poorly documented changes within a database environment also contribute to the pool of challenged queries.

Blocking and Locking in SQL

In SQL Server, data is accessed by applications, services, and users. Each process running in SQL Server has its own unique ID called a SPID (Server Process ID); these process IDs represent individual sessions, and each time an application connects to SQL Server, a new connection (or SPID) is created. The database engine issues locks to prevent simultaneous changes to the same data. When another request comes in, it is blocked until the data is released by the previous session — for example, a SELECT * statement without a WHERE clause has the potential to cause lock contention. A production environment can become problematic if multiple sessions are locked and blocked, and one way to view this activity is to use the dynamic views.

In older releases of SQL Server, one of the more common stored procedures used to determine blocking was sp_who. Now, administrators are pointed to dynamic views as a key source of information.

There are at least four dynamic views that can be used to dig into this performance issue. The dynamic view sys.dm_tran_locks shows the lock requests that are currently active in SQL Server.

The view sys.dm_os_waiting_tasks shows the queue of tasks that are waiting for SQL Server resources. The view sys.dm_exec_requests details information about each request executing on the server. The view sys.dm_exec_sql_text returns the text of the SQL identified by a given handle.

The example shown below is a query[ii] using sys.dm_exec_requests and sys.dm_exec_sql_text. In this example, the Microsoft sample database WideWorldImporters has two SELECT statements running against the same table. Until one statement completes its execution, the other SELECT statement is blocked.

-- Show blocked requests, their statement text, and the blocking session.
SELECT  session_id
       ,DB_NAME(database_id) AS [Database_Name]
       ,sql_text.text AS [SQL_Request_Blocked]
       ,blocking_session_id
FROM sys.dm_exec_requests
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sql_text
WHERE blocking_session_id != 0;

In this example, a script[ii] provided by Microsoft combines dynamic views to display the text of SQL queries being executed in batches and provides statistical information about each of the query plans.

-- Statement text plus execution statistics for cached query plans.
SELECT s2.dbid,
       s1.sql_handle,
       (SELECT TOP 1 SUBSTRING(s2.text, statement_start_offset / 2 + 1,
          ((CASE WHEN statement_end_offset = -1
                 THEN (LEN(CONVERT(nvarchar(max), s2.text)) * 2)
                 ELSE statement_end_offset END) - statement_start_offset) / 2 + 1)) AS sql_statement,
       execution_count,
       plan_generation_num,
       last_execution_time,
       total_worker_time,
       last_worker_time,
       min_worker_time,
       max_worker_time,
       total_physical_reads,
       last_physical_reads,
       min_physical_reads,
       max_physical_reads,
       total_logical_writes,
       last_logical_writes,
       min_logical_writes,
       max_logical_writes
FROM sys.dm_exec_query_stats AS s1
CROSS APPLY sys.dm_exec_sql_text(s1.sql_handle) AS s2
WHERE s2.objectid IS NULL
ORDER BY s1.sql_handle, s1.statement_start_offset, s1.statement_end_offset;

The worker-time columns represent the timings on query plans and their execution; all timings are in microseconds. Individual statements of SQL can be taken from this query's output to view exactly what is being executed. The SQL text can be analyzed carefully to determine whether it is impacting the system or the business process, and it is also helpful to execute it outside of the application to debug it or trace the activity. The query results can help determine the root cause of performance issues when the data is the culprit or the victim.

A deeper look into these stats with the following query will return the top five most expensive queries for review:

Expensive Queries

-- Top five statements by average CPU time, with the statement text extracted
-- from the cached batch.
SELECT TOP 5 total_worker_time / execution_count AS [Avg CPU Time],
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
         ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
           END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY total_worker_time / execution_count DESC;


Wrapping Up 

To maintain a highly available system, the database needs to be responsive and reliable. Performance and reliability are key components of business processes. These goals are attainable by taking the time to become more aware of the built-in functions of SQL Server. These tools and views are designed to provide insight into some of the most challenging areas of SQL Server. Consider the latest release of SQL Server. Microsoft is making strides in handling all the nuances of data from within. Database Administrators and Data Engineers can certainly add dynamic views to their arsenal of tools used to manage the performance of the SQL environment.