
How to plot Pandas Dataframe as a Table in Python using Plotly?

Sometimes it's necessary to plot a pandas DataFrame as a table in Python/Jupyter Notebook, for instance as an easy reference to a data dictionary. Plotly makes this straightforward.

Step 1: Install Plotly

pip install plotly
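
The example below also uses pandas to read the CSV file, so make sure it is installed as well (a standard pip install is assumed here):

pip install pandas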

Step 2: Import the required libraries

import pandas as pd
import plotly.graph_objects as go

Step 3: Plot the table

# read the data dictionary
data_dict = pd.read_csv("data_dictionary.csv")
fig = go.Figure(data=[go.Table(
    header=dict(values=list(data_dict.columns),
                fill_color='paleturquoise',
                align='left'),
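    # go.Table expects 'values' as a list of columns, so transpose the row-wise DataFrame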
    cells=dict(values=data_dict.transpose().values.tolist(),
               fill_color='lavender',
               align='left'))
])

fig.show()
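
If you want to keep the rendered table for later reference, the figure can also be exported; the width, height and file name below are just illustrative values.

# optionally size the table and save it as a standalone HTML file
fig.update_layout(width=800, height=400)
fig.write_html("data_dictionary_table.html")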


