Sourcery Starbot ⭐ refactored manncodes/xeno #1

Open · wants to merge 1 commit into base: master
xeno/activations.py (9 changes: 4 additions & 5 deletions)
```diff
@@ -59,7 +59,7 @@ def forward(self, input):
         return np.maximum(0.0, input)
 
     def derivative(self, input=None):
-        last_forward = input if input else self.last_forward
+        last_forward = input or self.last_forward
         res = np.zeros(last_forward.shape, dtype=get_dtype())
         res[last_forward > 0] = 1.
         return res
```

Sourcery comment: Function ReLU.derivative refactored with the following changes:
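A caveat on this refactor (the original spelling shares it): both `input if input else ...` and `input or ...` evaluate the truth value of `input`, which raises a `ValueError` for any multi-element NumPy array, so `derivative()` only works while callers stick to the default `input=None`. A minimal reproduction, with the None-safe idiom that avoids array truthiness (a sketch, not part of this PR):

```python
import numpy as np

a = np.array([1.0, -2.0, 3.0])
fallback = np.zeros(3)

# Both `a or fallback` and `a if a else fallback` evaluate bool(a),
# which NumPy refuses for arrays with more than one element.
try:
    _ = a or fallback
except ValueError as e:
    print(e)  # "The truth value of an array with more than one element is ambiguous..."

# None-safe idiom: tests identity only, never array truthiness.
last_forward = a if a is not None else fallback
print(last_forward)  # [ 1. -2.  3.]
```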
```diff
@@ -76,7 +76,7 @@ def forward(self, input):
         return input
 
     def derivative(self, input=None):
-        last_forward = input if input else self.last_forward
+        last_forward = input or self.last_forward
         return np.ones(last_forward.shape, dtype=get_dtype())
```

Sourcery comment: Function Linear.derivative refactored with the following changes:

(The same array-truthiness caveat as in ReLU.derivative applies here.)


```diff
@@ -93,11 +93,10 @@ def forward(self, input):
         self.last_forward = input
         x = input - np.max(input, axis=1, keepdims=True)
         exp_x = np.exp(x)
-        s = exp_x / np.sum(exp_x, axis=1, keepdims=True)
-        return s
+        return exp_x / np.sum(exp_x, axis=1, keepdims=True)
 
     def derivative(self, input=None):
-        last_forward = input if input else self.last_forward
+        last_forward = input or self.last_forward
         return np.ones(last_forward.shape, dtype=get_dtype())
```

Sourcery comment: Function Softmax.forward refactored with the following changes:

Sourcery comment: Function Softmax.derivative refactored with the following changes:
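Aside from the inlining, the `input - np.max(...)` shift in `Softmax.forward` is the standard numerical-stability trick: softmax is invariant to subtracting a per-row constant, and shifting by the row max keeps every exponent at or below zero so `np.exp` cannot overflow. A standalone sketch (the function name is illustrative, not from the PR):

```python
import numpy as np

def stable_softmax(x):
    # softmax(x) == softmax(x - c) row-wise for any constant c;
    # c = row max keeps all exp() arguments <= 0, so no overflow.
    z = x - np.max(x, axis=1, keepdims=True)
    exp_z = np.exp(z)
    return exp_z / np.sum(exp_z, axis=1, keepdims=True)

logits = np.array([[1000.0, 1001.0, 1002.0]])
print(stable_softmax(logits))  # [[0.09003057 0.24472847 0.66524096]], no overflow
```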


xeno/initializations.py (2 changes: 1 addition & 1 deletion)
```diff
@@ -95,7 +95,7 @@ def decompose_size(size):
         fan_in = size[0]
         fan_out = size[1]
 
-    elif len(size) == 4 or len(size) == 5:
+    elif len(size) in [4, 5]:
         respective_field_size = np.prod(size[2:])
         fan_in = size[1] * respective_field_size
         fan_out = size[0] * respective_field_size
```

Sourcery comment: Function decompose_size refactored with the following changes:
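For context on the branch being touched: with a 4-D convolution kernel laid out as `(out_channels, in_channels, kernel_h, kernel_w)` (an assumption from the diff, not stated in it), the receptive-field size is the product of the trailing spatial dims. A runnable sketch:

```python
import numpy as np

def decompose_size(size):
    # Sketch of the touched branches; the fallback for other ranks is
    # hypothetical, since the diff doesn't show it.
    if len(size) == 2:
        fan_in, fan_out = size[0], size[1]
    elif len(size) in [4, 5]:
        respective_field_size = np.prod(size[2:])
        fan_in = size[1] * respective_field_size
        fan_out = size[0] * respective_field_size
    else:
        fan_in = fan_out = int(np.sqrt(np.prod(size)))
    return fan_in, fan_out

print(decompose_size((64, 3, 3, 3)))  # (27, 576): 3*3*3 in, 64*3*3 out
```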
xeno/layers/core.py (10 changes: 4 additions & 6 deletions)
```diff
@@ -86,8 +86,7 @@ def forward(self, input, *args, **kwargs):
         self.last_input = input
         linear_out = np.dot(input, self.W) + self.b
-        act_out = self.act_layer.forward(linear_out)
-        return act_out
+        return self.act_layer.forward(linear_out)
 
     def backward(self, pre_grad, *args, **kwargs):
```

Sourcery comment: Function Dense.forward refactored with the following changes:
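The shapes involved, for reference (a sketch; the dimension names are illustrative): `input (batch, n_in) @ W (n_in, n_out) + b (n_out,)`, with `b` broadcast across the batch.

```python
import numpy as np

batch, n_in, n_out = 32, 100, 10
x = np.random.randn(batch, n_in)
W = np.random.randn(n_in, n_out) * 0.01  # weights
b = np.zeros(n_out)                      # bias, broadcast over rows

linear_out = np.dot(x, W) + b
print(linear_out.shape)  # (32, 10)
```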

```diff
@@ -128,11 +127,10 @@ def connect_to(self, prev_layer):
     def forward(self, input, train=True, *args, **kwargs):
 
         if 0. < self.p < 1.:
-            if train:
-                self.last_mask = get_rng().binomial(1, 1 - self.p, input.shape) / (1 - self.p)
-                return input * self.last_mask
-            else:
+            if not train:
                 return input * (1 - self.p)
+            self.last_mask = get_rng().binomial(1, 1 - self.p, input.shape) / (1 - self.p)
+            return input * self.last_mask
         else:
             return input
```

Sourcery comment: Function Dropout.forward refactored with the following changes:
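One observation while here (pre-existing behavior, not introduced by this PR): the train branch already uses inverted dropout, dividing the mask by `1 - p` so that the expected output equals the input during training. Also multiplying by `1 - p` at test time therefore looks like it compensates twice. A quick expectation check:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5
x = np.ones(100_000)

# Train path: inverted dropout keeps the expectation at ~1.0 ...
mask = rng.binomial(1, 1 - p, x.shape) / (1 - p)
print((x * mask).mean())     # ~1.0

# ... so the test path's extra (1 - p) scaling lands at 0.5, not 1.0.
print((x * (1 - p)).mean())  # 0.5
```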
xeno/layers/normalization.py (5 changes: 1 addition & 4 deletions)
```diff
@@ -92,10 +92,7 @@ def backward(self, pre_grad, *args, **kwargs):
         dmu = -1 * np.sum(dxmu1 + dxmu2, axis=0)
         dx2 = 1. / N * np.ones((N, D)) * dmu
 
-        # step0 done!
-        dx = dx1 + dx2
-
-        return dx
+        return dx1 + dx2
 
     @property
     def params(self):
```

Sourcery comment: Function BatchNormal.backward refactored with the following changes:

This removes the following comments (why?):

# step0 done!
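On the summed gradient itself: `dx1 + dx2` is the chain rule adding the two paths by which each `x` reaches the loss, directly through `x - mu` and indirectly through `mu = x.mean(0)`. A tiny finite-difference check of that structure on a toy objective (illustrative only, not the layer's actual loss):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((4, 3))

def f(x):
    # toy objective with the same x -> (x - mean) structure
    return np.sum((x - x.mean(axis=0)) ** 2)

# Direct path gives 2*(x - mu); the mean path adds a term that happens
# to vanish here because centered values sum to zero per column.
analytic = 2 * (x - x.mean(axis=0))

eps = 1e-6
xp = x.copy()
xp[0, 0] += eps
numeric = (f(xp) - f(x)) / eps
print(numeric, analytic[0, 0])  # agree to ~1e-5
```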
xeno/model.py (8 changes: 2 additions & 6 deletions)
```diff
@@ -64,10 +64,7 @@ def fit(self, X, Y, max_iter=100, batch_size=64, shuffle=True,
         else:
             valid_X, valid_Y = None, None
 
-        iter_idx = 0
-        while iter_idx < max_iter:
-            iter_idx += 1
-
+        for iter_idx in range(1, max_iter + 1):
             # shuffle
             if shuffle:
                 seed = get_rng().randint(111, 1111111)
```

Sourcery comment: Function Model.fit refactored with the following changes:
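The rewrite is behavior-preserving because the old loop increments `iter_idx` before the body runs, so it takes the values `1..max_iter`, exactly what `range(1, max_iter + 1)` yields:

```python
max_iter = 3

old = []
iter_idx = 0
while iter_idx < max_iter:
    iter_idx += 1          # increment happens before the body uses it
    old.append(iter_idx)

new = list(range(1, max_iter + 1))
print(old == new)  # True
```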
```diff
@@ -142,8 +139,7 @@ def predict(self, X):
         x_next = X
         for layer in self.layers[:]:
             x_next = layer.forward(x_next)
-        y_pred = x_next
-        return y_pred
+        return x_next
 
     def accuracy(self, outputs, targets):
         y_predicts = np.argmax(outputs, axis=1)
```

Sourcery comment: Function Model.predict refactored with the following changes:
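A further micro-cleanup the bot didn't flag: `self.layers[:]` shallow-copies the list before iterating, which matters only if the loop mutates the list; plain `self.layers` would do here. A sketch with hypothetical stand-in layers:

```python
import numpy as np

# Stand-in "layers": callables from array to array, purely illustrative.
layers = [lambda x: x @ np.eye(3), lambda x: np.maximum(x, 0.0)]

x_next = np.array([[1.0, -2.0, 3.0]])
for layer in layers:        # no copy needed; the list isn't mutated
    x_next = layer(x_next)
print(x_next)               # [[1. 0. 3.]]
```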