
Updating Beta branch from main #36

Merged
merged 54 commits into from
Oct 18, 2023
Changes from all commits
42f93d1
Updating The Array Module
erfanzar Oct 3, 2023
62b031f
Merge branch 'mojo-beta' of https://github.com/erfanzar/EasyDel into …
erfanzar Oct 3, 2023
390b9c7
Adding Dynamic Array Indexing for NDArrays
erfanzar Oct 3, 2023
97fc065
fixing mistral models
erfanzar Oct 6, 2023
ac6413a
Fixing `Transform-Mistral`
erfanzar Oct 6, 2023
ea90e62
Updating Packages (Now NDArrays works with `Mojo-0.4.0`)
erfanzar Oct 6, 2023
07e52f5
Adding `LLamaState`
erfanzar Oct 6, 2023
0e8d0ee
Todo: Create transpose func
erfanzar Oct 6, 2023
41be7fd
fixing some funny bugs
erfanzar Oct 7, 2023
7148091
Fixing `MistralForCausalLM` Bugs
erfanzar Oct 7, 2023
9bad45f
Fixing `MistralForCausalLM` Bugs
erfanzar Oct 7, 2023
040abd4
changing default DType for `MistralForCausalLM`
erfanzar Oct 7, 2023
249bc1f
changing default DType for `MistralForCausalLM`
erfanzar Oct 7, 2023
3351211
Adding `c_max_position_embeddings` for those who dont want to allocat…
erfanzar Oct 7, 2023
2b07926
Fixing Mistral Transofrm
erfanzar Oct 7, 2023
749ae13
Adding 3D transpose
erfanzar Oct 7, 2023
f5d0ae1
Update in Arrays
erfanzar Oct 8, 2023
4e7c8d5
Update JAXServer.md
erfanzar Oct 8, 2023
f3d24b4
Fixing A line Bug in `__elemet_wise_operations__`
erfanzar Oct 8, 2023
f061758
adding silu relu leaky_relu softmax
erfanzar Oct 8, 2023
1918b58
Changing Structer new version
erfanzar Oct 9, 2023
8a950f5
Adding Readable sign for PMJ
erfanzar Oct 9, 2023
7cb9a8f
Fixing Issue #32
erfanzar Oct 9, 2023
22f580f
Fixing Run Bugs
erfanzar Oct 9, 2023
c76cc5e
Addding DTypePointer funcs
erfanzar Oct 10, 2023
a782bf9
Changing Matmul Single Row
erfanzar Oct 10, 2023
dc0cfcf
Change code struct
erfanzar Oct 10, 2023
2fe30dc
Fixing Matmul Bug
erfanzar Oct 11, 2023
1f9f370
Adding attention_weight
erfanzar Oct 11, 2023
d2094f0
meking attention_weight more parallelized
erfanzar Oct 11, 2023
5fff3bf
Adding `dot_product_attention_weights` and `dot_product_attention`
erfanzar Oct 12, 2023
8a7a4c2
Addig `PointerOperations` changes in `Attention` and adding `view`,`i…
erfanzar Oct 12, 2023
4706f2a
Updating To Fix `Array` bugs
erfanzar Oct 12, 2023
23e2445
Fixing `Llama` Model issues
erfanzar Oct 12, 2023
c4f9b60
Fixing `Llama` Model issues
erfanzar Oct 12, 2023
ba8cbd8
Version `0.0.1` Mojo EasyDel With `Llama` Model 🔥
erfanzar Oct 12, 2023
b58db44
Merge pull request #33 from erfanzar/main
erfanzar Oct 12, 2023
5e6834f
Update README.md
erfanzar Oct 12, 2023
5fee8c0
Merge branch 'mojo-beta' of https://github.com/erfanzar/EasyDel into …
erfanzar Oct 12, 2023
bcb346f
Update README.md
erfanzar Oct 12, 2023
3d82ac1
Updating `Llama` Model
erfanzar Oct 12, 2023
d3a4441
Updating Llama Model
erfanzar Oct 13, 2023
cf325c1
Adding New Scheduler for optimizers `warm_up_linear`
erfanzar Oct 13, 2023
9ae2a33
Adding Llama weight convertor
erfanzar Oct 14, 2023
ed64f6e
Adding New Utils and making Arrays Faster
erfanzar Oct 14, 2023
f19093d
Debuging new fucntions
erfanzar Oct 15, 2023
4d4af29
Fixing Array issues related to `GetAxis` `Transpose` and adding more …
erfanzar Oct 15, 2023
c122587
making fill vectorized
erfanzar Oct 15, 2023
8301839
Fixing typing bug
erfanzar Oct 15, 2023
7916d1e
`TODO: Making the EasyDel Echosystem in mojo more like pytorch`
erfanzar Oct 16, 2023
c41f7d0
Fixing Embedding Layer
erfanzar Oct 17, 2023
1335143
Fixing Rope Bugs
erfanzar Oct 17, 2023
9ea94c1
Merge pull request #35 from erfanzar/mojo-beta
erfanzar Oct 18, 2023
24e7110
Debuging `Mistral` Models
erfanzar Oct 18, 2023
4 changes: 2 additions & 2 deletions README.md
@@ -177,8 +177,8 @@ train_args = TrainArguments(
num_train_epochs=3,
learning_rate=1e-5,
learning_rate_end=1e-6,
-    optimizer='lion', # 'adamw', 'lion', 'adafactor','warm_up_cosine' are supported
-    scheduler='linear', # 'linear' or 'cosine' or 'none'
+    optimizer='lion', # 'adamw', 'lion', 'adafactor' are supported
+    scheduler='linear', # 'linear','cosine', 'none' ,'warm_up_cosine' and 'warm_up_linear' are supported
weight_decay=0.01,
total_batch_size=16,
max_steps=None, # None to let trainer Decide
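The `warm_up_linear` scheduler this PR adds (see the commit "Adding New Scheduler for optimizers `warm_up_linear`") can be sketched as below. This is an illustrative Python sketch, not EasyDel's actual implementation — the function name, signature, and defaults here are assumptions:

```python
def warm_up_linear(step, total_steps, warmup_steps, lr, lr_end=0.0):
    """Linear warm-up to `lr`, then linear decay toward `lr_end`."""
    if step < warmup_steps:
        # ramp the learning rate up linearly from 0 to lr
        return lr * step / max(1, warmup_steps)
    # decay linearly from lr down to lr_end over the remaining steps
    frac = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return lr + frac * (lr_end - lr)
```

With `learning_rate=1e-5` and `learning_rate_end=1e-6` as in the README snippet above, the rate climbs to `1e-5` during warm-up and then falls linearly to `1e-6` by the final step.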
56 changes: 44 additions & 12 deletions docs/Mojo/Array.md
@@ -130,46 +130,67 @@ Takes DType as dynamic Input like `Array[DType.float32]`

`fn __init__(inout self: Self, array_shape: ArrayShape):`

- Description: Init Array From ArrayShape(Alloc Zero).

`fn __init__(inout self: Self, A: Self, B: Self) -> None:`

- Description: Init Array From Two other Arrays A and B For Matmul(Alloc One).

`fn __init__(inout self: Self, vl: VariadicList[Int]):`

- Description: Init Array from VariadicList[Int](Alloc Zero).

`fn __init__(inout self: Self, init: Bool, *dim: Int) -> None:`

- Description: Init Array from Int Args(Depends on Passed Bool).

`fn __init__(inout self: Self, *dim: Int):`

- Description: Init Array from Int Args(Alloc Zero).

`fn __init__(inout self: Self, value: DynamicVector[FloatLiteral], shape: ArrayShape) -> None:`

- Description: Init Array from ArrayShape and load data from DynamicVector[FloatLiteral](Alloc One).

`fn __init__(inout self: Self, value: VariadicList[FloatLiteral], shape: ArrayShape) -> None:`

- Description: Init Array from ArrayShape and load data from VariadicList[FloatLiteral](Alloc One).

`fn __init__(inout self: Self, pointer: DTypePointer[T], *dim: Int) -> None:`

- Description: Init Array from IntArgs and load data from DTypePointer[T](Alloc One).

`fn __init__(inout self: Self, pointer: DTypePointer[T], *dim: Int) -> None:`

- Description: Init Array from given data from DTypePointer[T](Alloc Zero).

### Alloc

`fn alloc(inout self: Self) -> None:`

- Description: Allocate or Init The Array.

`fn alloc(inout self: Self, fill:SIMD[T, 1]) -> None:`

- Description: Allocate or Init The Array and fill it with the given value.

### Random

`fn random(inout self: Self) -> None:`

- Description: Randomize The Data if the Array is Allocated.

### Reshape and View

View:

* INOUT `fn view(inout self, *dims: Int):`
* View changes the shape unconditionally and does not check whether the new shape fits.

Reshape:

* INOUT `fn reshape(inout self, *dims: Int):`
* Reshape changes the shape and checks whether the new shape fits.
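The fit check that separates `reshape` from `view` is just an element-count comparison, as in NumPy-style libraries. A hedged Python sketch (helper names here are for illustration only, not EasyDel's code):

```python
def check_reshape(old_shape, new_shape):
    """Return True when new_shape holds exactly as many elements as
    old_shape -- the condition `reshape` verifies and `view` skips."""
    def numel(shape):
        n = 1
        for d in shape:
            n *= d
        return n
    return numel(old_shape) == numel(new_shape)
```

`check_reshape((2, 3), (3, 2))` passes, while `check_reshape((2, 3), (4, 2))` would make `reshape` fail where `view` would silently accept the mismatch.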

### Dim

@@ -203,7 +224,7 @@ fn load[

### Store Functions

```
fn store[
nelts: Int, off: Int
](self, index: InlinedFixedVector[off, Int], val: SIMD[T, nelts]) -> None:
```

@@ -225,6 +246,10 @@ fn store[

`fn __getitem__(self, index: Int) -> SIMD[T, 1]:`

`fn __getitem__(self, d1: Int, d2: Int) raises -> SIMD[T, 1]:`

`fn __getitem__(self, d1: Int, d2: Int, d3: Int) raises -> SIMD[T, 1]:`

### `__setitem__` Functions

`fn __setitem__(self, index: Int, val: SIMD[T, 1]) -> None:`
@@ -233,9 +258,16 @@ fn store[

`fn __setitem__[off: Int](self, index: StaticIntTuple[off], val: SIMD[T, 1]):`

`fn __setitem__(self, d1: Int, d2: Int, val:SIMD[T, 1]) raises->None:`

`fn __setitem__(self, d1: Int, d2: Int, d3: Int, val:SIMD[T, 1]) raises->None:`
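Multi-dimensional accessors like the 2D and 3D overloads above typically map their indices to a flat row-major offset into the underlying pointer. A small illustrative sketch of that mapping (not EasyDel's actual code):

```python
def flat_offset(shape, *indices):
    """Row-major (C-order) flattening of multi-dimensional indices."""
    off = 0
    for dim, i in zip(shape, indices):
        assert 0 <= i < dim, "index out of bounds"
        off = off * dim + i
    return off
```

For a `(3, 4)` array, index `(1, 2)` lands at flat offset `6`, which is the position a 2D `__getitem__`/`__setitem__` pair would read from or write to.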

### Math Functions

```mojo

fn argmax(self: Self, axis: Int = -1) -> Int:

# cos

fn cos(inout self: Self) -> Self:
@@ -379,4 +411,4 @@ fn acosh(inout self: Self, rt: Runtime) -> Self:
# Fill the whole array with the val

fn fill(inout self: Self, val: SIMD[T, 1]) -> None:
```
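This PR's commits also add activation functions (`softmax`, `silu`, `relu`, `leaky_relu`, `sigmoid`) to the array module. Reference versions in plain Python, for orientation only — EasyDel's Mojo implementations are vectorized and may differ in detail:

```python
import math

def softmax(xs):
    # subtract the max before exponentiating, for numerical stability
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def silu(x):
    # SiLU (swish): x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def leaky_relu(x, negative_slope=0.01):
    # pass positives through, scale negatives by a small slope
    return x if x > 0 else negative_slope * x
```

The `negative_slope` default of `0.01` is an assumption borrowed from common deep-learning libraries, not a value confirmed by this PR.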
2 changes: 1 addition & 1 deletion docs/Python/JAXServer.md
@@ -94,7 +94,7 @@ you can use this function outside the class like this

```python
for string, num_used_tokens in server.process(
-    'im an string',
+    'im a string',
greedy=False,
max_new_tokens=256 # or None to use Maximum numbers passed in Config
):
2 changes: 1 addition & 1 deletion examples/training/causal-lm/gpt-j.py
@@ -102,7 +102,7 @@
flags.DEFINE_string(
name="scheduler",
default='cosine',
-    help='which scheduler to use (available schedulers are cosine linear none warm_up_cosine)'
+    help='which scheduler to use (available schedulers are cosine linear none warm_up_cosine , warm_up_linear)'
)

flags.DEFINE_string(
21 changes: 20 additions & 1 deletion lib/mojo/EasyDel/__init__.🔥
@@ -6,13 +6,32 @@ EasyDel Library [MOJO/EasyDel]
from .array import (
Array,
ArrayShape,
# Array Module
convert_numpy_to_easydel_array,
matmul_shape,
matmul_single_row,
matmul_2d,
matmul,
T_AXIS_0_2_1, # LOW LEVEL AND PARALLELIZED , VECTORIZED
T_AXIS_2_1_0, # LOW LEVEL AND PARALLELIZED , VECTORIZED
T_AXIS_1_0_2, # LOW LEVEL AND PARALLELIZED , VECTORIZED
T_2D, # LOW LEVEL AND PARALLELIZED , VECTORIZED
CAT_3D_AXIS_2, # LOW LEVEL AND PARALLELIZED , VECTORIZED
rotate_half, # LOW LEVEL AND PARALLELIZED , VECTORIZED
arange,
# Array Utils
softmax,
silu,
relu,
leaky_relu,
sigmoid,
sample_array,
# Funcs
PointerOperation,
# ED Struct
)
from .in_out import File, BufReader
-from .tokenizer import Tokenizer, load_tokenizer, byte_pr_tokenizer_encoder
+from .tokenizer import Tokenizer, load_tokenizer, byte_pr_tokenizer_encoder, loop_sort
from .utilities import (
read_numerical_value,
read_string_value,
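Among the new exports, `rotate_half` is the standard building block of rotary position embeddings (RoPE). Assuming EasyDel follows the usual definition, it splits the last dimension in half and rotates — a Python sketch:

```python
def rotate_half(x):
    """Split the vector in half and rotate: [a, b] -> [-b, a].
    Standard helper for rotary position embeddings (RoPE)."""
    n = len(x) // 2
    first, second = x[:n], x[n:]
    return [-v for v in second] + list(first)
```

In RoPE this feeds the identity `x * cos + rotate_half(x) * sin`, which the PR's Llama and Mistral models would apply to query and key projections.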
10 changes: 10 additions & 0 deletions lib/mojo/EasyDel/array/__init__.🔥
@@ -8,6 +8,16 @@ from .array_module import Array, ArrayShape
from .array_utils import (
convert_numpy_to_easydel_array,
matmul_shape,
matmul_single_row,
matmul_2d,
matmul,
T_AXIS_0_2_1, # LOW LEVEL AND PARALLELIZED , VECTORIZED
T_AXIS_2_1_0, # LOW LEVEL AND PARALLELIZED , VECTORIZED
T_AXIS_1_0_2, # LOW LEVEL AND PARALLELIZED , VECTORIZED
T_2D, # LOW LEVEL AND PARALLELIZED , VECTORIZED
CAT_3D_AXIS_2, # LOW LEVEL AND PARALLELIZED , VECTORIZED
rotate_half, # LOW LEVEL AND PARALLELIZED , VECTORIZED
arange,
)
from .funcs import softmax, silu, relu, leaky_relu, sigmoid, sample_array
from .ed_struct import PointerOperation
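The exported `matmul_shape` helper plausibly computes the result shape of a matrix product before allocation. A hedged sketch of that computation (the exact EasyDel signature is not shown in this diff):

```python
def matmul_shape(a_shape, b_shape):
    """Result shape of a @ b: a's leading dims plus b's last dim.
    The inner dimensions must agree."""
    assert a_shape[-1] == b_shape[-2], "inner dimensions must agree"
    return tuple(a_shape[:-1]) + (b_shape[-1],)
```

For example, a `(2, 3)` by `(3, 4)` product yields `(2, 4)`, and a batched `(5, 2, 3)` by `(3, 7)` yields `(5, 2, 7)` — the kind of shape `Array.__init__(inout self, A, B)` above would allocate for matmul.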