Shawn Zhong

钟万祥
  • Tutorials
  • Mathematics
    • Math 240
    • Math 375
    • Math 431
    • Math 514
    • Math 521
    • Math 541
    • Math 632
    • Abstract Algebra
    • Linear Algebra
    • Category Theory
  • Computer Sciences
    • CS/ECE 252
    • CS/ECE 352
    • Learn Haskell
  • AP Notes
    • AP Microecon
    • AP Macroecon
    • AP Statistics
    • AP Chemistry
    • AP Physics E&M
    • AP Physics Mech
    • CLEP Psycho


Math 375 – Homework 6

  • Oct 26, 2017
  • Shawn
  • Math 375

Math 375 – 10/12

  • Oct 31, 2017
  • Shawn
  • Math 375
Injective

• Definition
  ○ If V, W are vector spaces and T:V→W is linear
  ○ then T is injective if for all x,y∈V
  ○ Tx=Ty ⇒ x=y
• Theorem
  ○ T:V→W is injective if and only if for all x∈V
  ○ Tx=0 ⇒ x=0
  ○ i.e. if and only if N(T)={0}
• Proof
  ○ Suppose Tx=0 ⇒ x=0 for all x∈V
  ○ Let x,y∈V be given, and assume
    § Tx=Ty
  ○ Since T is linear, we have
    § T(x−y)=Tx−Ty=0
  ○ Therefore
    § x−y=0
    § ⇒ x=y
  ○ Conversely, if T is injective, then Tx=0=T0 forces x=0

Null Space

• Definition
  ○ If T:V→W is linear then
  ○ Null(T)=N(T)=ker(T) ≝ {x∈V │ Tx=0}
• Theorem
  ○ N(T) is a linear subspace of V
• Proof
  ○ To prove N(T)⊆V is a linear subspace
  ○ we need to check the closure properties, i.e.
    § x,y∈N(T) ⇒ x+y∈N(T)
    § x∈N(T), c∈R ⇒ cx∈N(T)
  ○ Check closure under addition
    § Let x,y∈N(T), then Tx=0, Ty=0
    § We have T(x+y)=Tx+Ty=0+0=0
    § Therefore x+y∈N(T)
  ○ Check closure under scalar multiplication
    § Let x∈N(T), then Tx=0
    § Let c∈R, then T(cx)=c⋅Tx=c⋅0=0
    § Therefore cx∈N(T)
  ○ In conclusion, N(T)⊆V is a linear subspace

Range

• Definition
  ○ If T:V→W is linear then
  ○ Range(T)=R(T)={Tx │ x∈V}
• Theorem
  ○ R(T) is a linear subspace of W

Examples

• Example 1
  ○ Let V=W=R², T(x,y)=(x,y)
  ○ Injective?
    § Given (x,y)∈R² and (x̄,ȳ)∈R²
    § with T(x,y)=T(x̄,ȳ)
    § By definition of T
    § (x,y)=(x̄,ȳ)
    § So T is injective
  ○ Null space?
    § Because T is injective
    § N(T)={(0,0)}
  ○ Range?
    § R(T) ≝ {T(x,y) │ (x,y)∈R²} = R²
• Example 2
  ○ Let V=W=R², T(x,y)=(x,0)
  ○ Injective?
    § No
    § T(1,0)=T(1,1)=(1,0)
  ○ Null space?
    § N(T)={u │ Tu=0}={(0,t) │ t∈R}
  ○ Range?
    § R(T)={T(x,y) │ (x,y)∈R²}
    § ={(t,0) │ t∈R}
    § = x-axis
• Example 3
  ○ Let V=R³, W=R², T(x,y,z)=(x,y)
  ○ Injective?
    § No
    § T(1,1,0)=T(1,1,1)=(1,1)
  ○ Null space?
    § N(T)={(0,0,t) │ t∈R}
  ○ Range?
    § R(T)={T(x,y,z) │ (x,y,z)∈R³}
    § ={(x,y) │ (x,y)∈R²}=R²
• Example 4
  ○ Let V=R², W=R³, T(x,y)=(x,y,0)
  ○ T is injective
  ○ N(T)={(0,0)}
  ○ R(T)={(x,y,0) │ (x,y)∈R²} = xy-plane
• Summary

  T              │ V  │ W  │ N(T)   │ dim N(T) │ R(T)     │ dim R(T)
  T(x,y)=(x,y)   │ R² │ R² │ {0}    │ 0        │ R²       │ 2
  T(x,y)=(x,0)   │ R² │ R² │ y-axis │ 1        │ x-axis   │ 1
  T(x,y,z)=(x,y) │ R³ │ R² │ z-axis │ 1        │ R²       │ 2
  T(x,y)=(x,y,0) │ R² │ R³ │ {0}    │ 0        │ xy-plane │ 2

Rank–Nullity Theorem

• Statement
  ○ If T:V→W is linear and V is finite-dimensional
  ○ then dim N(T)+dim R(T)=dim V
• Proof
  ○ Let
    § dim N(T)=k
    § dim V=n
    § {e_1,…,e_k} be a basis for N(T)
  ○ Claim
    § {e_1,…,e_k}⊆V is independent
    § ⇒ it extends to a basis {e_1,…,e_k,e_(k+1),…,e_n} of V, so dim V=n
    § {Te_(k+1),Te_(k+2),…,Te_n} is a basis for R(T)
  ○ Prove {Te_(k+1),Te_(k+2),…,Te_n} is independent
    § Suppose
      □ c_(k+1)Te_(k+1)+…+c_nTe_n=0
    § Then
      □ T(c_(k+1)e_(k+1)+…+c_ne_n)=0
      □ ⇒ c_(k+1)e_(k+1)+…+c_ne_n∈N(T)
    § Since {e_1,…,e_k} is a basis for N(T)
      □ c_(k+1)e_(k+1)+…+c_ne_n=c_1e_1+…+c_ke_k for some c_1,…,c_k∈R
      □ ⇒ −c_1e_1−…−c_ke_k+c_(k+1)e_(k+1)+…+c_ne_n=0
    § Since {e_1,…,e_n} is independent
      □ c_1=c_2=…=c_n=0
    § In particular
      □ c_(k+1)Te_(k+1)+…+c_nTe_n=0
      □ implies c_(k+1)=c_(k+2)=…=c_n=0
    § Therefore
      □ {Te_(k+1),Te_(k+2),…,Te_n} is independent
  ○ Prove {Te_(k+1),Te_(k+2),…,Te_n} spans R(T)
    § Every y∈R(T) is of the form
      □ y=Tx
      □ for some x∈V
    § {e_1,…,e_n} is a basis for V, so
      □ x=x_1e_1+x_2e_2+…+x_ne_n
      □ for some x_1,x_2,…,x_n∈R
    § Therefore
      □ y=Tx
      □ =T(x_1e_1+x_2e_2+…+x_ne_n)
      □ =x_1Te_1+…+x_kTe_k+x_(k+1)Te_(k+1)+…+x_nTe_n
      □ =x_(k+1)Te_(k+1)+…+x_nTe_n ∈ span{Te_(k+1),…,Te_n}
      □ (the first k terms vanish because e_1,…,e_k∈N(T))
  ○ Conclusion
    § dim R(T)=n−k=dim V−dim N(T)
    § ⇒ dim N(T)+dim R(T)=dim V
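The identity dim N(T)+dim R(T)=dim V can be spot-checked numerically for the four maps in the summary table, by representing each map by its matrix and computing dim R(T) as the matrix rank. A minimal sketch (the `rank` helper and the matrix encodings are our own, not from the lecture):

```python
def rank(rows):
    """Rank of a matrix (list of rows) by Gaussian elimination."""
    m = [list(map(float, r)) for r in rows]
    r = 0  # number of pivots found so far
    for col in range(len(m[0])):
        # find a pivot row for this column among the unused rows
        piv = next((i for i in range(r, len(m)) if abs(m[i][col]) > 1e-12), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > 1e-12:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Matrices of T(x,y)=(x,y); T(x,y)=(x,0); T(x,y,z)=(x,y); T(x,y)=(x,y,0)
maps = [
    [[1, 0], [0, 1]],
    [[1, 0], [0, 0]],
    [[1, 0, 0], [0, 1, 0]],
    [[1, 0], [0, 1], [0, 0]],
]
for a in maps:
    n = len(a[0])             # dim V = number of columns
    dim_range = rank(a)       # dim R(T)
    dim_null = n - dim_range  # dim N(T), by rank-nullity
    print(dim_null, dim_range, n)
```

The printed triples match the dim N(T) and dim R(T) columns of the table above.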

Math 375 – 10/11

  • Oct 26, 2017
  • Shawn
  • Math 375
Question

• Given
  ○ V=C([−1,1])
  ○ ⟨v,w⟩=∫_(−1)^1 v(x)w(x)dx
• Find the linear polynomial closest to f(x)=e^x
• Answer
  ○ Let S=span{1,x}
  ○ Since ⟨1,x⟩=∫_(−1)^1 x dx=0, the basis {1,x} is orthogonal, so the projection of f onto S is
  ○ ⟨1,e^x⟩/⟨1,1⟩ ⋅1 + ⟨x,e^x⟩/⟨x,x⟩ ⋅x
  ○ where ⟨1,e^x⟩=e−e^(−1), ⟨1,1⟩=2, ⟨x,e^x⟩=2e^(−1), ⟨x,x⟩=2/3
  ○ Therefore the linear polynomial closest to f(x)=e^x is
  ○ g(x)=3/e⋅x+(e−e^(−1))/2
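The two projection coefficients can be checked numerically by approximating the inner-product integrals on [−1,1] with a midpoint rule and comparing against the closed forms (e−e^(−1))/2 and 3/e. A small sketch; the helper name `inner` is ours:

```python
import math

def inner(v, w, n=20000):
    """<v, w> = integral of v(x)*w(x) over [-1, 1], midpoint rule."""
    h = 2.0 / n
    return h * sum(v(-1 + (i + 0.5) * h) * w(-1 + (i + 0.5) * h)
                   for i in range(n))

one = lambda x: 1.0   # the basis function 1
x_fn = lambda x: x    # the basis function x

a = inner(one, math.exp) / inner(one, one)     # constant coefficient
b = inner(x_fn, math.exp) / inner(x_fn, x_fn)  # coefficient of x

print(a)  # should be close to (e - 1/e)/2
print(b)  # should be close to 3/e
```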

Math 375 – 10/10

  • Oct 26, 2017
  • Shawn
  • Math 375
Linear Transformations

• Definition
  ○ Let V and W be two vector spaces
  ○ Then a map/function/transformation/mapping T:V→W is called linear if
    § T(x+y)=T(x)+T(y)  ∀x,y∈V
    § T(c⋅x)=c⋅T(x)  ∀x∈V, c∈R
• Mapping notation
  ○ In the mapping T:V→W
  ○ V is called the domain
  ○ W is called the codomain or target set
  ○ T(v) must be defined ∀v∈V
  ○ T(v) always belongs to W
• Example 1
  ○ Let V,W be any vector spaces
  ○ Define Tx=0, ∀x∈V
  ○ T(x+y)=0 and T(x)+T(y)=0+0=0, so T(x+y)=Tx+Ty
  ○ T(c⋅x)=0 and c⋅T(x)=c⋅0=0, so T(c⋅x)=c⋅T(x)
  ○ Therefore this mapping is a linear transformation
• Example 2
  ○ Let V,W be any vector spaces
  ○ Define Tv=w≠0, ∀v∈V
  ○ T(x)+T(y)=2w≠w=T(x+y)
  ○ Therefore this mapping is not a linear transformation
• Example 3
  ○ Let V=W be the same vector space
  ○ Define Tx=x, ∀x∈V
  ○ Then T is a linear transformation
  ○ T is called the identity map from V to V
  ○ Common notations: id, id_V, 1_V
• Example 4
  ○ Let V=W=R²
  ○ Define T(x,y)=(2x,2y), i.e. Tu=2u
  ○ T(u)+T(v)=2u+2v=2(u+v)=T(u+v)
  ○ T(c⋅u)=2c⋅u=c⋅(2u)=c⋅T(u)
  ○ Therefore T is a linear transformation
• Example 5
  ○ Let V=W=R²
  ○ Define T(a,b)=(b,a)
  ○ It is reflection in the diagonal, and it is linear
• Example 6
  ○ Let V=W=R²
  ○ Define Tu = u rotated by 30° counter-clockwise
  ○ A picture shows T(u+v)=T(u)+T(v)
  ○ We can also show that T(c⋅v)=c⋅T(v)
  ○ Therefore T is a linear transformation

Linear Transformation on Basis

• Theorem
  ○ Suppose T:V→W is a linear transformation
  ○ Let {e_1,…,e_n} be a basis for V
  ○ Then T is completely determined by
  ○ {Te_1,Te_2,…,Te_n}
  ○ Suppose we know Te_1,Te_2,…,Te_n,
  ○ and let x∈V be given
  ○ Then there are c_1,c_2,…,c_n∈R
  ○ such that x=c_1e_1+c_2e_2+…+c_ne_n, and then
  ○ T(x)=T(c_1e_1+c_2e_2+…+c_ne_n)
  ○ =T(c_1e_1)+T(c_2e_2)+…+T(c_ne_n)
  ○ =c_1Te_1+c_2Te_2+…+c_nTe_n
• Example (Rotation)
  ○ Let V=W=R²
  ○ Define T = rotation by θ counter-clockwise
  ○ Pick the basis {e_1,e_2}, where
    § e_1=(1,0)
    § e_2=(0,1)
  ○ Compute Te_1, Te_2
    § Te_1=(cos θ, sin θ)
    § Te_2=(−sin θ, cos θ)
  ○ Compute T(ae_1+be_2)
    § T(ae_1+be_2)
    § =aTe_1+bTe_2
    § =a(cos θ, sin θ)+b(−sin θ, cos θ)
    § =(a cos θ−b sin θ, a sin θ+b cos θ)

Solving Systems of Equations

• Setup
  ○ a_11x_1+a_12x_2+…+a_1nx_n=y_1
  ○ a_21x_1+a_22x_2+…+a_2nx_n=y_2
  ○ ⋮
  ○ a_n1x_1+a_n2x_2+…+a_nnx_n=y_n
  ○ Define a transformation T:Rⁿ→Rⁿ
  ○ Let x=(x_1,…,x_n), y=(y_1,…,y_n)
  ○ Then the system reads Tx=y, where T is a linear transformation
• Property of one-to-one maps
  ○ A linear map T:V→W is one-to-one
  ○ if for all u,v∈V
  ○ Tu=Tv ⇒ u=v
  ○ i.e. the equation Tx=y has at most one solution
• Example of a one-to-one map
  ○ Let V=R², W=R³
  ○ T(x_1,x_2)=(x_1,x_2,0)=(y_1,y_2,y_3)
  ○ 1x_1+0x_2=y_1, 0x_1+1x_2=y_2, 0x_1+0x_2=y_3 ⇒ y_1=x_1, y_2=x_2, y_3=0
  ○ Three equations, two unknowns
• Theorem
  ○ A linear map T:V→W is injective
  ○ if and only if for all x∈V
  ○ Tx=0 ⇒ x=0
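The "determined by its action on a basis" idea can be made concrete for the rotation example: knowing Te_1 and Te_2 determines T on every (a,b) by linearity. A small sketch (the function name `rotate` is ours):

```python
import math

def rotate(theta, a, b):
    """Rotate (a, b) by theta counter-clockwise, built from T e1 and T e2."""
    Te1 = (math.cos(theta), math.sin(theta))   # image of e1 = (1, 0)
    Te2 = (-math.sin(theta), math.cos(theta))  # image of e2 = (0, 1)
    # T(a e1 + b e2) = a*Te1 + b*Te2 by linearity
    return (a * Te1[0] + b * Te2[0], a * Te1[1] + b * Te2[1])

# Rotating (1, 0) by 90 degrees should give (0, 1), up to rounding
x, y = rotate(math.pi / 2, 1, 0)
print(x, y)
```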

Math 375 – 10/9

  • Oct 26, 2017
  • Shawn
  • Math 375
Question

• Let V be a finite-dimensional inner product space
• S⊆V is a subspace of V
• Let S^⊥={v∈V │ ∀s∈S, ⟨v,s⟩=0}
• Prove (S^⊥)^⊥=S

Answer

First, (S^⊥)^⊥ is the orthogonal complement of S^⊥, which is itself the orthogonal complement of S, so (S^⊥)^⊥=S means that S is the orthogonal complement of its own orthogonal complement. To show this, we want to show that S is contained in (S^⊥)^⊥ and, conversely, that (S^⊥)^⊥ is contained in S; if we can show both containments, then the only possible conclusion is that (S^⊥)^⊥=S.

To show the first containment, suppose v∈S and w∈S^⊥. Then ⟨v,w⟩=0 by the definition of S^⊥. Thus S is certainly contained in (S^⊥)^⊥ (which consists of all vectors orthogonal to S^⊥).

To show the other containment, suppose v∈(S^⊥)^⊥ (meaning that v is orthogonal to all vectors in S^⊥); we want to show that v∈S. I'm sure there must be a better way to see this, but here's one that works. Identify V with Rⁿ by choosing a basis, where n=dim V. Let {u_1,…,u_p} be a basis for S and let {w_1,…,w_q} be a basis for S^⊥. If v∉S, then {u_1,…,u_p,v} is a linearly independent set. Since each vector in that set is orthogonal to all of S^⊥, the set {u_1,…,u_p,v,w_1,…,w_q} is linearly independent. Since there are p+q+1 vectors in this set, this means that p+q+1≤n or, equivalently, p+q≤n−1.

On the other hand, if A is the matrix whose ith row is u_i^T, then the row space of A is S and the null space of A is S^⊥. Since S is p-dimensional, the rank of A is p, meaning that the dimension of null(A)=S^⊥ is q=n−p. Therefore p+q=p+(n−p)=n, contradicting the fact that p+q≤n−1. From this contradiction we see that, if v∈(S^⊥)^⊥, it must be the case that v∈S.
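A concrete instance of (S^⊥)^⊥=S in R³ (this sketch and its names are ours): take S to be the xy-plane, so S^⊥ is the z-axis, and a vector orthogonal to the z-axis lies back in the xy-plane.

```python
def dot(u, v):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(u, v))

# S = xy-plane in R^3, spanned by u1, u2; S-perp = z-axis, spanned by w1
u1, u2 = (1, 0, 0), (0, 1, 0)
w1 = (0, 0, 1)

# S is contained in (S-perp)-perp: each basis vector of S is orthogonal to w1
print(dot(u1, w1), dot(u2, w1))

# (S-perp)-perp is contained in S: a vector orthogonal to w1 has zero
# z-coordinate, hence is a combination of u1 and u2, i.e. lies in S
v = (3, -2, 0)
print(dot(v, w1))
print(v == tuple(3 * a - 2 * b for a, b in zip(u1, u2)))  # v = 3*u1 - 2*u2
```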

Copyright © 2018.