Are you comfortable with Theano's basic objects and operations by now? If not, go back and review Basic Tensor Functionality first!!
Computing a function elementwise means applying the function to each element of the matrix separately.
import theano.tensor as T
from theano import function
x = T.dmatrix('x')
s = 1/(1 + T.exp(-x)) # logistic function
logistic = function([x],s)
logistic([[0,1],[-1,-2]])
array([[ 0.5 , 0.73105858], [ 0.26894142, 0.11920292]])
logistic is computed elementwise because all of its operations (division, addition, exponentiation) are themselves elementwise operations.
The same is true in the following case.
Let's check that the two expressions produce the same values!
s2 = (1+T.tanh(x/2))/2 # alternative logistic function
logistic2 = function([x],s2)
logistic2([[0,1],[-1,-2]])
array([[ 0.5 , 0.73105858], [ 0.26894142, 0.11920292]])
# produce the same values~!!
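Outside of Theano, the identity between the two formulas can be verified with a quick NumPy check (a sketch for illustration, not part of the tutorial's Theano code):

```python
import numpy as np

# Both formulas, applied elementwise by NumPy just as Theano applies them:
x = np.array([[0.0, 1.0], [-1.0, -2.0]])
s1 = 1 / (1 + np.exp(-x))      # logistic
s2 = (1 + np.tanh(x / 2)) / 2  # tanh-based form

print(np.allclose(s1, s2))  # True: the two expressions agree elementwise
```

The identity follows from tanh(x/2) = (e^(x/2) - e^(-x/2)) / (e^(x/2) + e^(-x/2)): adding 1 and halving leaves exactly 1/(1 + e^(-x)).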
Theano supports functions with multiple outputs.
a,b = T.dmatrices('a','b')
diff = a-b # (1)
abs_diff = abs(diff) # (2)
diff_squared = diff**2 # (3)
f = function([a,b],[diff,abs_diff, diff_squared])
dmatrices produces as many outputs as names that you provide.
It is a shortcut for allocating symbolic variables that we will often use in the tutorials.
f([[1, 1], [1, 1]], [[0, 1], [2, 3]])
[array([[ 1., 0.], [-1., -2.]]), array([[ 1., 0.], [ 1., 2.]]), array([[ 1., 0.], [ 1., 4.]])]
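For comparison, the same three elementwise results computed directly in plain NumPy, with a Python tuple playing the role of the Theano function's list of outputs (a sketch, not Theano code):

```python
import numpy as np

a = np.array([[1.0, 1.0], [1.0, 1.0]])
b = np.array([[0.0, 1.0], [2.0, 3.0]])

diff = a - b               # elementwise difference
abs_diff = np.abs(diff)    # elementwise absolute value
diff_squared = diff ** 2   # elementwise square

print(diff_squared)  # [[1. 0.] [1. 4.]]
```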
Let's define a function that adds two numbers.
However, when only one number is provided, the other input is assumed to be 1.
from theano import Param
x, y = T.dscalars('x', 'y')
z = x + y
f = function([x, Param(y, default=1)], z)
f(33) # only one input was given, so the second defaults to 1: 33 + 1
array(34.0)
f(33, 2) # 33 + 2
array(35.0)
Param class
we give a default value of 1 for y by creating a Param instance with its default field set to 1.
Inputs with default values must follow inputs without default values (like Python’s functions)
function([x, Param(y, default=1)], z)
x, y, w = T.dscalars('x', 'y', 'w')
z = (x + y) * w
f = function([x, Param(y, default=1), Param(w, default=2, name='w_by_name')], z)
f(33) # (33 + 1)* 2
array(68.0)
f(33, 2) # (33 + 2) * 2
array(70.0)
f(33, 0, 1) # (33 + 0) * 1
array(33.0)
f(33, w_by_name=1) # (33 + 1) * 1
array(34.0)
f(33, w_by_name=1, y=0) # (33 + 0) * 1
array(33.0)
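The behaviour mirrors ordinary Python default and keyword arguments. A plain-Python analogue (a hypothetical f, for illustration only):

```python
# Defaults must follow non-default parameters, and named arguments
# may be supplied out of order -- just like the Theano function above.
def f(x, y=1, w_by_name=2):
    return (x + y) * w_by_name

print(f(33))                    # (33 + 1) * 2 -> 68
print(f(33, 2))                 # (33 + 2) * 2 -> 70
print(f(33, w_by_name=1, y=0))  # (33 + 0) * 1 -> 33
```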
Param does not know the names of the local variables y and w that are passed to it as arguments. The symbolic variable objects have name attributes, and these are the names of the keyword parameters in the functions that we build.
Param(y, default=1)
<theano.compile.pfunc.Param at 0x16644ef0>
Param(w, default=2, name='w_by_name')
<theano.compile.pfunc.Param at 0x16644a90>
You may like to see Function in the library for more detail.
function with an internal state
Let's build an accumulator.
The state is initialized to zero, and on each function call the state is incremented by the function's argument.
from theano import shared
state = shared(0)
inc = T.iscalar('inc')
accumulator = function([inc], state, updates=[(state, state+inc)])
The shared function constructs so-called shared variables
These are hybrid symbolic and non-symbolic variables: their value is shared between multiple functions.
A shared variable can be used in symbolic expressions, just like the objects returned by dmatrices(), but it also has an internal value.
updates must be supplied with a list of pairs of the form
(shared-variable, new expression).
Each time the function runs, it replaces the .value of each shared variable with the result of the corresponding expression.
The accumulator above will replace state's value with the sum of the state and the increment amount. Let's check:
state.get_value()
array(0)
accumulator(1)
array(0)
state.get_value()
array(1)
accumulator(300)
array(1)
state.get_value()
array(301)
Reset the state:
state.set_value(-1)
accumulator(3)
array(-1)
state.get_value()
array(2)
As mentioned above, you can define more than one function that uses the same shared variable; these functions can all update the value.
decrementor = function([inc], state, updates=[(state, state-inc)])
decrementor(2)
array(2)
state.get_value()
array(0)
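As a loose pure-Python picture of these semantics (not how Theano actually implements shared variables), one mutable value is shared by several functions, and each call returns the old value and then applies its update expression:

```python
# Hypothetical SharedValue class, sketching shared-variable behaviour.
class SharedValue(object):
    def __init__(self, value):
        self.value = value
    def get_value(self):
        return self.value
    def set_value(self, value):
        self.value = value

state = SharedValue(0)

def accumulator(inc):
    old = state.get_value()
    state.set_value(old + inc)   # like updates=[(state, state + inc)]
    return old

def decrementor(inc):
    old = state.get_value()
    state.set_value(old - inc)   # like updates=[(state, state - inc)]
    return old

print(accumulator(1))     # 0 -- the value from before the update
print(state.get_value())  # 1
print(decrementor(2))     # 1
print(state.get_value())  # -1
```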
Why does the updates mechanism exist? You could get similar results by returning the new expressions and working with them in NumPy as usual. While the updates mechanism can be a syntactic convenience, it exists mainly for efficiency: updates to shared variables can sometimes be done more quickly using in-place algorithms!
Also, Theano has more control over where and how shared variables are allocated, which is important for getting good performance on the GPU.
When you have expressed a formula using a shared variable but do not want to use its value, you can use the givens parameter of function, which replaces a particular node in the graph.
fn_of_state = state * 2 + inc
# The type of foo must match the shared variable we are replacing with the ``givens``
foo = T.scalar(dtype=state.dtype)
skip_shared = function([inc, foo], fn_of_state,
                       givens=[(state, foo)])
skip_shared(1, 3) # we're using 3 for the state, not state.value
array(7)
state.get_value() # old state still there, but we didn't use it
array(0)
The givens parameter can be used to replace any symbolic variable, not just a shared variable; you can replace constants and expressions in general. Be careful, though, not to allow the expressions introduced by a givens substitution to be co-dependent: the order of substitution is not defined, so the substitutions have to work in any order.
In practice, a good way to think about givens is as a mechanism that lets you replace any part of your formula with a different expression that evaluates to a tensor of the same shape and dtype.
A Theano shared variable's broadcast pattern defaults to False for each dimension. A shared variable's size can change over time, so we can't use the shape to determine the broadcastable pattern. If you want a different pattern, just pass it as a parameter: theano.shared(..., broadcastable=(True, False)).
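For contrast, NumPy infers broadcastability from the actual shape at run time, which is exactly what Theano cannot do for shared variables. A small NumPy sketch of what broadcastable=(True, False) corresponds to:

```python
import numpy as np

# A dimension of size 1 is stretched to match the other operand.
row = np.zeros((1, 3))    # corresponds to broadcastable=(True, False)
mat = np.ones((4, 3))

print((mat + row).shape)  # (4, 3): the length-1 first dimension is broadcast
```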
The way to think about putting randomness into Theano's computations is to put random variables in your graph. Theano will allocate a NumPy RandomStream object (a random number generator) for each such variable, and draw from it as necessary. We will call this sort of sequence of random numbers a random stream. Random streams are at their core shared variables, so the observations on shared variables hold here as well. Theano's random objects are defined and implemented in RandomStreams and, at a lower level, in RandomStreamsBase.
NumPy RandomStream object : a random number generator
random stream : a sequence of random numbers
from theano.tensor.shared_randomstreams import RandomStreams
from theano import function
srng = RandomStreams(seed=234)
rv_u = srng.uniform((2,2))
rv_n = srng.normal((2,2))
f = function([], rv_u)
g = function([], rv_n, no_default_updates=True) #Not updating rv_n.rng
nearly_zeros = function([], rv_u + rv_u - 2 * rv_u)
Here, ‘rv_u’ represents a random stream of 2x2 matrices of draws from a uniform distribution. Likewise, ‘rv_n’ represents a random stream of 2x2 matrices of draws from a normal distribution. The distributions that are implemented are defined in RandomStreams and, at a lower level, in raw_random. They only work on CPU. See Other Implementations for GPU version.
Now let’s use these objects. If we call f(), we get random uniform numbers. The internal state of the random number generator is automatically updated, so we get different random numbers every time.
f_val0 = f()
f_val0
array([[ 0.44078224, 0.26993381], [ 0.14317277, 0.43571539]])
f_val1 = f() #different numbers from f_val0
f_val1
array([[ 0.86342685, 0.81031029], [ 0.86695784, 0.6813093 ]])
g_val0 = g() # different numbers from f_val0 and f_val1
g_val0
array([[ 0.37328447, -0.65746672], [-0.36302373, -0.97484625]])
g_val1 = g() # same numbers as g_val0!
g_val1
array([[ 0.37328447, -0.65746672], [-0.36302373, -0.97484625]])
An important remark is that a random variable is drawn at most once during any single function execution. So the nearly_zeros function is guaranteed to return approximately 0 (except for rounding error) even though the rv_u random variable appears three times in the output expression.
nearly_zeros = function([], rv_u + rv_u - 2 * rv_u)
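A loose pure-Python sketch of that "drawn at most once" rule (a hypothetical nearly_zeros, not Theano's machinery): the variable is sampled once per call, and that single draw is reused everywhere it appears in the output expression.

```python
import random

def nearly_zeros():
    u = random.random()   # one draw, shared by all three occurrences
    return u + u - 2 * u

print(nearly_zeros())  # 0.0
```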
Random variables can be seeded individually or collectively. You can seed just one random variable by seeding or assigning to the .rng attribute, using .rng.set_value().
rng_val = rv_u.rng.get_value(borrow=True) # Get the rng for rv_u
rng_val.seed(89234) # seeds the generator
rv_u.rng.set_value(rng_val, borrow=True) # Assign back seeded rng
You can also seed all of the random variables allocated by a RandomStreams object by that object’s seed method. This seed will be used to seed a temporary random number generator, that will in turn generate seeds for each of the random variables.
srng.seed(902340) # seeds rv_u and rv_n with different seeds each
As usual for shared variables, the random number generators used for random variables are common between functions. So our nearly_zeros function will update the state of the generators used in function f above.
state_after_v0 = rv_u.rng.get_value().get_state()
nearly_zeros() # this affects rv_u's generator
array([[ 0., 0.], [ 0., 0.]])
v1 = f()
v1
array([[ 0.23219826, 0.25305996], [ 0.02116774, 0.65845077]])
rng = rv_u.rng.get_value(borrow=True)
rng.set_state(state_after_v0)
rv_u.rng.set_value(rng, borrow=True)
v2 = f() # v2 != v1
v2
array([[ 0.62720432, 0.90458979], [ 0.14363919, 0.89279932]])
v3 = f() # v3 == v1
v3
array([[ 0.23219826, 0.25305996], [ 0.02116774, 0.65845077]])
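Since random streams are backed by ordinary NumPy generators, the same rewind trick can be sketched with a plain numpy.random.RandomState: restoring a saved state makes the generator replay the same sequence of draws.

```python
import numpy as np

rng = np.random.RandomState(234)
state_after_v0 = rng.get_state()   # save the generator state

v1 = rng.uniform(size=(2, 2))
rng.set_state(state_after_v0)      # rewind the generator
v3 = rng.uniform(size=(2, 2))

print(np.array_equal(v1, v3))      # True: same state, same numbers
```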
In some use cases, a user might want to transfer the “state” of all random number generators associated with a given theano graph (e.g. g1, with compiled function f1 below) to a second graph (e.g. g2, with function f2). This might arise for example if you are trying to initialize the state of a model, from the parameters of a pickled version of a previous model. For theano.tensor.shared_randomstreams.RandomStreams and theano.sandbox.rng_mrg.MRG_RandomStreams this can be achieved by copying elements of the state_updates parameter.
Each time a random variable is drawn from a RandomStreams object, a tuple is added to the state_updates list. The first element is a shared variable, which represents the state of the random number generator associated with this particular variable, while the second represents the theano graph corresponding to the random number generation process (i.e. RandomFunction{uniform}.0).
An example of how “random states” can be transferred from one theano function to another is shown below.
import theano
import numpy
import theano.tensor as T
from theano.sandbox.rng_mrg import MRG_RandomStreams
from theano.tensor.shared_randomstreams import RandomStreams
class Graph():
    def __init__(self, seed=123):
        self.rng = RandomStreams(seed)
        self.y = self.rng.uniform(size=(1,))

g1 = Graph(seed=123)
f1 = theano.function([], g1.y)
g2 = Graph(seed=987)
f2 = theano.function([], g2.y)
print 'By default, the two functions are out of sync.'
print 'f1() returns ', f1()
print 'f2() returns ', f2()
By default, the two functions are out of sync. f1() returns [ 0.72803009] f2() returns [ 0.55056769]
def copy_random_state(g1, g2):
    if isinstance(g1.rng, MRG_RandomStreams):
        g2.rng.rstate = g1.rng.rstate
    for (su1, su2) in zip(g1.rng.state_updates, g2.rng.state_updates):
        su2[0].set_value(su1[0].get_value())
print 'We now copy the state of the theano random number generators.'
copy_random_state(g1, g2)
print 'f1() returns ', f1()
print 'f2() returns ', f2()
We now copy the state of the theano random number generators. f1() returns [ 0.59044123] f2() returns [ 0.59044123]
other implementations
There are two other implementations, based on MRG31k3p and CURAND. RandomStreams only works on the CPU; MRG31k3p works on both the CPU and the GPU; CURAND only works on the GPU.
To use the MRG version easily, you can just change the import to:
from theano.sandbox.rng_mrg import MRG_RandomStreams as RandomStreams
import numpy
import theano
import theano.tensor as T
rng = numpy.random
N = 400
feats = 784
D = (rng.randn(N, feats), rng.randint(size=N, low=0, high=2))
training_steps = 10000
# Declare Theano symbolic variables
x = T.matrix("x")
y = T.vector("y")
w = theano.shared(rng.randn(feats), name="w")
b = theano.shared(0., name="b")
print "Initial model:"
print w.get_value(), b.get_value()
Initial model: [ 2.97739684 0.60328309 0.18612802 -0.17431635 1.81582416 ... -0.34556396 -0.98861983] 0.0 (784 randomly initialized weights, truncated)
# Construct Theano expression graph
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b)) # Probability that target = 1
prediction = p_1 > 0.5 # The prediction thresholded
xent = -y * T.log(p_1) - (1-y) * T.log(1-p_1) # Cross-entropy loss function
cost = xent.mean() + 0.01 * (w ** 2).sum() # The cost to minimize
gw, gb = T.grad(cost, [w, b]) # Compute the gradient of the cost
# (we shall return to this in a
# following section of this tutorial)
# Compile
train = theano.function(
          inputs=[x,y],
          outputs=[prediction, xent],
          updates=((w, w - 0.1 * gw), (b, b - 0.1 * gb)))
predict = theano.function(inputs=[x], outputs=prediction)
# Train
for i in range(training_steps):
    pred, err = train(D[0], D[1])
print "Final model:"
print w.get_value(), b.get_value()
print "target values for D:", D[1]
print "prediction on D:", predict(D[0])
Final model: [ -1.81131249e-01 7.91474551e-02 -1.44525934e-01 ... 4.64017277e-02 -3.22607263e-02] -0.0716169629983 (784 trained weights, truncated)
target values for D: [1 0 1 1 0 1 0 0 0 1 ...]
prediction on D: [1 0 1 1 0 1 0 0 0 1 ...] (the predictions match the targets exactly)
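For reference, here is a NumPy-only sketch of the same logistic regression, with the gradients that T.grad derives automatically written out by hand (a rough illustration under the same model, not the Theano implementation; runs and accuracies will differ with the random seed):

```python
import numpy as np

rng = np.random.RandomState(0)
N, feats, training_steps = 400, 784, 10000
X = rng.randn(N, feats)
y = rng.randint(low=0, high=2, size=N)

w = rng.randn(feats)
b = 0.0
for i in range(training_steps):
    p_1 = 1 / (1 + np.exp(-(X.dot(w) + b)))   # probability that target = 1
    gw = X.T.dot(p_1 - y) / N + 2 * 0.01 * w  # d(cost)/dw, incl. the L2 term
    gb = (p_1 - y).mean()                     # d(cost)/db
    w -= 0.1 * gw
    b -= 0.1 * gb

prediction = p_1 > 0.5
print((prediction == y).mean())  # training accuracy
```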