Last time we explored plotting and the basic math-and-array library for Python.
Today we are going to get a little more abstract with two libraries that handle advanced math. This will also empower us to attack roots and matrices.
These two libraries are
SciPy : A set of resources for linear algebra, statistics, and differentiation/integration. SymPy : A symbolic computer algebra system (which does a few things that we've done with Mathcad).
We’ll start by exploring SymPy and then move forward with some tricks in SciPy. This will then have us ready to start attacking some of our past homework assignments the “python” way.
Let's start by loading up a set of libraries that we'll be working with today.
Along with the above two libraries we'll be digging out NumPy and Matplotlib.
Let's call our four libraries using the handle system to "grab" them when we run our program.
##########################################################
#
# Library Calls.
#
import numpy as np
import matplotlib.pyplot as plt
# loading sympy
import sympy as sym
# For SciPy, there are some modules that you will need to
# explicitly pull (rather like the pyplot area
# of matplotlib)
import scipy as sci # basic scipy
import scipy.integrate as sciint # integration tools
import scipy.optimize as sciopt # optimization tools (for roots)
import scipy.linalg as scilina # linear algebra tools
import scipy.misc as scimisc # some other tools (includes derivatives)
#
##########################################################
Variables in symbolic math solvers, whether in Python or MATLAB, are handled differently than an ordinary variable or array.
To play in SymPy, we create a variable (or symbol). This will be independent of the NumPy library we worked with last session, as will a lot of what we'll be doing in this part of today's exercise.
To access the function that creates a variable, just as we did with the NumPy library, we invoke the library and then the symbols() function. The documentation for it is here: sympy.symbols.
################################################################
#
# Let's make a x and y for use in symbolic solutions
#
x = sym.symbols('x')
y = sym.symbols('y')
#
################################################################
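As a side note, sympy.symbols can create several symbols in one call when you separate the names with spaces — a small self-contained sketch:

```python
import sympy as sym

# create two symbols in a single call by separating names with a space
x, y = sym.symbols('x y')
print(x, y)
```

This is equivalent to the two separate calls above.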
Now let's test this with something simple: Let's do a very simple derivative using the sympy.diff function:
$$ \frac{d}{dx}\left(x^2 - 3 x + 2 \right) $$
################################################################
#
# Demonstrating how to differentiate
# a very simple math function
#
sym.diff(x**2 - 3 * x + 2, # we write the function....
x) # and we're differentiating by "x"
#
################################################################
Ooooo... Pretty. If you look closely, the font is different from the normal Python output. That's actually HTML in Jupyter. (It's not as pretty if you have to work in a boring terminal. Indeed, the results can be so ugly that I avoided SymPy before working with Jupyter Notebooks.)
OK now let's try this one (more relevant to what we are doing above with pi!)...
$$ \frac{d}{dx}\left( 4 \arctan{x} \right) $$
(Note that we will need to use SymPy's version of any math function that we'd otherwise use from the NumPy library, where we'd normally do our number-crunching.)
################################################################
#
# Demonstrating how to differentiate
# using a basic math function
#
sym.diff( 4 * sym.atan(x), x)
#
################################################################
Still pretty.
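Once you have a symbolic result, sympy.lambdify will convert it into an ordinary numeric function that you can evaluate on NumPy arrays — handy for plotting later. A minimal sketch using the derivative we just computed:

```python
import numpy as np
import sympy as sym

x = sym.symbols('x')
dfdx = sym.diff(4 * sym.atan(x), x)        # symbolic result: 4/(x**2 + 1)
f_num = sym.lambdify(x, dfdx, 'numpy')     # convert to a numeric function
print(f_num(0.0))                          # 4.0
print(f_num(np.array([0.0, 1.0, 2.0])))    # evaluates elementwise
```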
Bet you can guess what we do next:
We can symbolically integrate as well (we'll do this numerically with SciPy shortly), such as the one from our pi example. The function for symbolic integration, while accessible from the "root" sympy library area, is sympy.integrals.integrals.integrate.
$$ \int {\frac{4}{1 + x^2}}dx $$
################################################################
#
# Demonstrating how to integrate a function
#
sym.integrate( 4 / (1 + x*x), # our function
x) # the variable over which we'll integrate
#
################################################################
And we can request specific ranges for our integral!
$$ \int_{0}^{1} {\frac{4}{1 + x^2}}dx $$
################################################################
#
# Demonstrating how to integrate a function
# over a range (here the upper limit is inclusive)
#
sym.integrate( 4 / (1 + x*x), # our function
(x, 0, 1) ) # the value and limits over
# which we'll integrate
#
################################################################
(Honestly the first time I did this on Jupyter, I was expecting the equivalent of np.pi!)
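If you'd rather have a floating-point number than the symbolic pi, the .evalf() method (or plain float()) converts a SymPy result to a numeric value:

```python
import sympy as sym

x = sym.symbols('x')
result = sym.integrate(4 / (1 + x*x), (x, 0, 1))  # symbolic result: pi
print(result)            # pi
print(result.evalf())    # 3.14159265358979...
print(float(result))     # a plain Python float
```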
We'll play rough with roots in SciPy shortly as well... but let's have a little SymPy fun first.
Let's try a basic quadratic equation! Here we can pretend that we've already slammed all of our terms onto one side of the equals sign.
$$ f(x) = x^2 - 3 x + 2 $$
(This is one the school board would allow Mrs. Mercer to give you in middle school, and it's easy to factor without fighting with, or having to remember, the quadratic equation song.)
But remember... what do we do first?
Yup, we plot it!
SymPy has a separate plotting library, built on elements of matplotlib, that comes in handy here... You can also zero in on your x-range in a way similar to the bounded integration above! The default, like Mathcad, is to plot x from -10 to +10.
################################################################
#
# Showing how to plot a function using
# the SymPy package (and how to plot it
# over a range where x is bounded between
# 0 and 3).
#
sym.plotting.plot(x**2 - 3*x + 2, # here is our function
(x, 0, 3)) # and our x axis
# is set from 0 to 3)
#
################################################################
For a more complicated problem you could now fetch a couple values on the x axis for first guesses of your root!
But let's try this function below to solve the system using sympy.solve.
################################################################
#
# Demonstrating the Sym.Solve Function
#
sym.solve(x**2 - 3*x + 2, # our function
x) # solve for x. (stick a pin in this one!)
#
################################################################
And we can have a little more fun (if you consider flashbacks to Mrs. Mercer's algebra class "fun") with sympy.factor.
################################################################
#
# Showing-off the Factoring Function
#
sym.factor(x**2 - 3*x + 2, # our function
x) # solve for x
#
################################################################
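sympy.expand is the inverse operation of factor, multiplying a factored expression back out — a quick sanity check:

```python
import sympy as sym

x = sym.symbols('x')
factored = sym.factor(x**2 - 3*x + 2)   # (x - 1)*(x - 2)
expanded = sym.expand(factored)         # back to x**2 - 3*x + 2
print(factored)
print(expanded)
```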
We can also do a bounded numerical root solution. (For a simple nonlinear equation, it defaults to the secant method. It can also manage overdetermined systems.) Here we use sympy.nsolve.
################################################################
#
# Showing-off the Nsolve Function
#
sym.nsolve((x**2 - 3*x + 2), # our function
x, # solve for x
[0, 1.5]) # between these bounds
#
################################################################
################################################################
#
# Showing-off the Nsolve Function WITH verbose output
#
sym.nsolve((x**2 - 3*x + 2), # our function
x, # solve for x
[0, 1.5], # between these bounds
verbose = True) # and show us the details of our procedure!
#
################################################################
################################################################
#
# EVEN MORE Showing-off the Nsolve Function WITH STILL MORE
# verbose output
#
sym.nsolve((x**2 - 3*x + 2), # our function
x, # solve for x
[0.50, 0.75], # between these bounds
solver='secant', # choose your method
verbose = True) # and show us the details of our procedure!
#
################################################################
Let's now leverage symbolic solutions to do some simple simultaneous equations.
Consider our standard linear equation pair.
$$f(x,y) = 2x-y+2$$
$$g(x,y) = 2x+y-6$$
Recall from our earlier work that
$$x = 1$$
$$y = 4$$
We'll explore a few elements of this that demonstrate the SymPy features
We already created a symbolic y along with x earlier, so we can hand both equations straight to the solver:
################################################################
#
# Solve the two equations symbolically using the same tool
#
sym.solve([2*x - y + 2, # our f(x,y) function
2*x + y - 6], # our g(x,y) function
x,y) # solve for x and y
#
################################################################
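For a linear system like this, sym.solve returns a dictionary keyed by symbol, so you can pull out each value individually:

```python
import sympy as sym

x, y = sym.symbols('x y')
solution = sym.solve([2*x - y + 2,      # our f(x,y) function
                      2*x + y - 6],     # our g(x,y) function
                     [x, y])
print(solution)      # {x: 1, y: 4}
print(solution[x])   # 1
print(solution[y])   # 4
```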
We can also prep our equations for plotting
################################################################
#
# Solve each equation for y to prep for plotting
#
yf = sym.solve(2*x - y + 2, # our f(x,y) function
y) # solve for y
yg = sym.solve( 2*x + y - 6, # our g(x,y) function
y) # solve for y
print(yf)
print(yg)
#
################################################################
and we can plot!
################################################################
#
# Now let's plot two equations at once.
#
sym.plotting.plot(( 2*x + 2), # yf(x)
(-2*x + 6), # yg(x)
(x,-1,3)) # zoom in on the x axis
#
################################################################
And now let's do some of these steps in the other library we're exploring this session: SciPy.
We'll mirror some of our earlier work from above.
Let's start with our arctangent-pi play from above.
$ f(x) = \frac{4}{1+x^2} $
Here is how we express this in a function in Python. (You can also do this with pi from our previous sessions.)
################################################################
#
# let's make a "function" in python
#
def f(x) :
return(4 / (1 + x**2))
#
################################################################
... and plot it out. (As you get braver and more comfortable with plotting in Python, there are tools that make nice and fancy plots, but for now let's keep it basic, with a few simple-to-make embellishments from last time: horizontal and vertical lines through our x=0 and y=0 axes with matplotlib.pyplot.axhline and matplotlib.pyplot.axvline.)
################################################################
#
# testing our function
#
print("f(x=0) = ", f(0))
print("f(x=1) = ", f(1))
print("f(x=2) = ", f(2))
#
# replotting the function with matplotlib
#
X = np.arange(-4,4,0.01)
plt.plot(X, f(X)) # plot x vs f(x)
plt.xlabel("x") # label your graphs or your
plt.ylabel("f(x)") # TA will yell at you
plt.ylim(-1,6) # you can change your axis limits
plt.xlim(-4,4) # (it'll take a little trial and error)
#
# (with those axes set you can toss in a pair of
# reference lines for your true x and y axes)
#
plt.axhline( y = 0, # through the origin
linewidth = 0.5, # skinny
color = 'black') # "paint it black"
plt.axvline( x = 0, # through the origin
linewidth = 0.5, # skinny
color = 'black') # "paint it black"
#
################################################################
What's that derivative where x = 1? This will be a quantitative, not symbolic, calculation derived at a given value on the "number line" using scipy.misc.derivative. (Heads up: scipy.misc.derivative is deprecated and has been removed in recent SciPy releases, so this call may not work on a new installation.)
################################################################
#
# a simple derivative evaluated at a given value
#
dfdx_of_1 = scimisc.derivative(f, 1.0)
print("f'(1) = ", dfdx_of_1)
#
################################################################
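Because scipy.misc.derivative is on its way out, it's worth knowing that the same idea is easy to write yourself. Here central_difference is our own hypothetical helper (not a library function) implementing a two-point central difference; the analytic value of f'(1) is -2:

```python
def central_difference(func, x0, dx=1.0e-6):
    # two-point central difference approximation of func'(x0);
    # dx is the step size (an illustrative choice, not a tuned value)
    return (func(x0 + dx) - func(x0 - dx)) / (2.0 * dx)

def f(x):
    return 4 / (1 + x**2)

print("f'(1) =", central_difference(f, 1.0))   # close to the analytic -2
```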
And let's say that our uncertainty in x ($\Delta x$) is 0.01. Recall how error propagates:
$$\Delta f = \frac{df}{dx}\Delta x$$
so...
################################################################
#
# error propagation via the derivative
#
uncertainty_of_x = 0.01
uncertainty_of_f_at_1 = scimisc.derivative(f, 1.0) * uncertainty_of_x
uncertainty_of_f_at_0 = scimisc.derivative(f, 0.0) * uncertainty_of_x
uncertainty_of_f_at_5 = scimisc.derivative(f, 5.0) * uncertainty_of_x
print("Δf(1) = ", uncertainty_of_f_at_1 )
print("Δf(0) = ", uncertainty_of_f_at_0 )
print("Δf(5) = ", uncertainty_of_f_at_5 )
#
################################################################
We can also get the area under the curve with a set of integration functions.
The most basic function for integrating that has worked well for me is scipy.integrate.quad (short for the quadrature method of estimating integrals).
Let's integrate our test-case function from 0 to 1, which, as you recall, should give $\pi$.
################################################################
#
# a simple integration
#
integral_of_f_from_0_to_1 = sciint.quad(f, # integrate function f
0, # ... from 0
1) # ... to 1
print("F(0,1) = ", integral_of_f_from_0_to_1)
#
# (that trailing value in the result is an estimate of the absolute
# error of the integral - as you can see this one is effectively
# zero)
#
print("result without error",integral_of_f_from_0_to_1[0])
#
################################################################
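quad can also handle improper integrals: just pass np.inf as a limit. For our function, the integral from 0 to infinity is $4 \cdot \pi/2 = 2\pi$:

```python
import numpy as np
import scipy.integrate as sciint

def f(x):
    return 4 / (1 + x**2)

# improper integral from 0 to infinity; the exact answer is 2*pi
value, abserr = sciint.quad(f, 0, np.inf)
print(value)   # about 6.2832
```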
################################################################
#
# let's make another function in python
#
def g(x) :
return(x**2 - 3*x + 2)
#
# replotting the function with matplotlib
#
X = np.arange(-1,4,0.01)
plt.plot(X, g(X)) # plot x vs f(x)
plt.xlabel("x") # label your graphs or your
plt.ylabel("g(x)") # TA will yell at you
plt.ylim(-1,6) # you can change your axis limits
plt.xlim(-1,4) # (it'll take a little trial and error)
plt.axhline( y = 0, # through the origin
linewidth = 0.5, # skinny
color = 'black') # "paint it black"
plt.axvline( x = 0, # through the origin
linewidth = 0.5, # skinny
color = 'black') # "paint it black"
#
################################################################
There are a number of root-finding functions (you have to scroll below the optimization routines at the above link to the area discussing "root finding"). You will recognize functions for the bisection, Newton, and secant methods... Brent's and Ridder's methods are "improvements" on the original classic false-position scheme.
################################################################
#
# Playing with a Newton Raphson Solver.
#
our_low_root = sciopt.newton(g,-1)
print("low x = ",our_low_root)
# another way to call it.
our_high_root = sciopt.newton(g,10)
print("high x = ",our_high_root)
#
################################################################
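The bisection method mentioned above is available as scipy.optimize.bisect; it needs a bracket where the function changes sign. A quick sketch for the lower root of g:

```python
import scipy.optimize as sciopt

def g(x):
    return x**2 - 3*x + 2

# g(0) = 2 > 0 and g(1.5) = -0.25 < 0, so [0, 1.5] brackets a root
low_root = sciopt.bisect(g, 0, 1.5)
print(low_root)   # 1.0 to within tolerance
```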
################################################################
#
# Playing with a Secant Solver.
#
our_low_root = sciopt.root_scalar( g, # func
x0 = 0.0, # guess
x1 = 0.5, # 2nd guess
method='secant') # method
print(" low x full results = ",our_low_root) # all results
print()
print(" low x value = ",our_low_root.root) # just the root
print()
print()
#
# Playing with a False Position (Brent's) Solver.
#
our_high_root = sciopt.root_scalar( g, # function
bracket=[1.5, 4], # this needs a bracket pair
method='brenth') # method (based on False-Position)
print("high x full results = ",our_high_root) # all results
print()
print(" high x value = ",our_high_root.root) # just the root
#
################################################################
Both NumPy and SciPy have linear algebra functions. SciPy has some spiffier additions, though, so I use it when I do matrix-fu.
Let's use our above equations...
$$f(x,y) = 2x-y+2$$$$g(x,y) = 2x+y-6$$which creates a matrix system
$$\mathbf{A}=\begin{bmatrix} 2 & -1 \\ 2 & 1 \end{bmatrix} \qquad \vec{b}=\begin{bmatrix} -2\\ 6 \end{bmatrix}$$
################################################################
#
# Using NumPy to make the array
#
A = np.array([[ 2, -1],
[ 2, 1]])
b = np.array([[-2],
[6]])
# you can also make b like this: np.array([-2, 6])
# frequently a 1-d array will be automatically treated
# like a "vertical" vector in some languages
#
################################################################
With this we can do some basic Matrix Operations...
################################################################
#
# Calculate Determinant and Inverse
#
det_of_A = scilina.det(A)
inv_of_A = scilina.inv(A)
print(" |A| = ", det_of_A)
print("inv(A) = ", inv_of_A)
#
################################################################
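A quick sanity check on the inverse: multiplying A by inv(A) should give the identity matrix (to floating-point precision):

```python
import numpy as np
import scipy.linalg as scilina

A = np.array([[2, -1],
              [2,  1]])
identity_check = A @ scilina.inv(A)   # should be the 2x2 identity
print(identity_check)
```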
Finally, we have the ability to solve matrix systems without an intermediate matrix-inverse step.
################################################################
#
# Solve the Matrix System Directly
#
AinvB = scilina.solve(A, b)
print(" x = ", AinvB)
#
################################################################
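And the usual check on any solve: the residual A @ x - b should be (numerically) zero, and the solution should match the x = 1, y = 4 we found symbolically:

```python
import numpy as np
import scipy.linalg as scilina

A = np.array([[2, -1],
              [2,  1]])
b = np.array([[-2],
              [ 6]])
x_vec = scilina.solve(A, b)
print(x_vec)           # column vector holding 1 and 4
print(A @ x_vec - b)   # residual, effectively zero
```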
There are a LOT of other resources available in SciPy. Nobody could (or should) try to dig into them all in one sitting. There are also resources for statistics and specialized physics problems, but I will save the former for another session where we replicate your previous assignments.
################################################################
#
# Loading Version Information
#
%load_ext version_information
%version_information version_information, numpy, sympy, scipy, matplotlib
#
################################################################