# AidLearning-FrameWork

**Repository Path**: zhangzsw/AidLearning-FrameWork

## Basic Information

- **Project Name**: AidLearning-FrameWork
- **Description**: Build Linux running on Android with GUI, Python and AI support. Python+Linux+Android+AI 4-in-1 environment.
- **Primary Language**: Python
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 1
- **Forks**: 1
- **Created**: 2019-07-27
- **Last Updated**: 2020-12-19

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

**AidLearning** is a Linux system running on Android with GUI, deep-learning and Python visual-programming support. The [AidLearning](http://www.aidlearning.net) framework provides a Linux+Android+AI+Python 4-in-1 environment, developed and maintained by several students from [Cas University](http://english.cas.cn) and [Yale University](https://www.yale.edu).

### Table of Contents

- Introduce
- Dependencies
- Installation
- Support
- Touch-and-Drag Style Programming
- SSH (PC connection)
- Aid_code IDE of Python
- Examples inside
- References

# Introduce

[](license.md) / Thanks to qidiso for providing the [中文说明](https://www.jianshu.com/p/f6ec13ece792) (Chinese documentation).

**AidLearning** builds a Linux system on an Android mobile phone with **GUI**, **Python** and **AI** programming support. Once it is installed, your Android phone hosts a Linux system in which you can run Python GUI programs and AI workloads.

It now supports a list of top machine-learning frameworks for deep learning, built in out of the box: **Caffe, TensorFlow, MXNet (and GluonCV), ncnn, Keras, PyTorch, OpenCV, SciPy**.

More than this, we provide an AI coding tool named **Aid_code**: a visual AI programming IDE that lets you program in Python from **zero** on our framework. Once it is installed, your Android phone owns a Linux system with a GUI in which you can write and run AI programs, just as on a PC.
In addition, AidLearning provides a new visual programming experience of **_touch-and-drag_** using Python on our framework.

At the same time, **AidLearning** provides wifi-based screen-mapping projection, which can project the code on the mobile phone to a PC and interact with it through **SSH** remote commands and a web interface. It can also be projected to a TV or projector for large-screen display.

In short, **AidLearning** creates a 4-in-1, touch-and-drag platform for rapid development and learning of **Android+Linux+AI+Python**. It not only lets you use a mobile phone for fragmented programming, but also makes full use of the development advantages of the two main operating systems (**Android+Linux**) and the phone's strengths as an AI terminal. With these advantages, **AidLearning** can build a complete learning ecosystem for programming education.

## Dependencies

All you need is an Android device (phone, tablet or ARM board) with an **Arm64 (aarch64)** CPU, running Android 6.0 or higher. If these parameters are not clear enough: most mainstream mobile phones are supported, such as _Samsung, Huawei, MI, OPPO, VIVO, nubia_, etc. In addition, the storage requirement is fairly large; at least **2 GB** of free storage space is recommended.

## Installation

To install **AidLearning**, simply download the app (apk file) and install it on your mobile device.

Download the newest version at: [Download v0.74 now!](http://www.aidlearning.net/downloads/aidlux-07-04.apk)

Other versions at: [https://github.com/aidlearning/AidLearning-FrameWork/releases](https://github.com/aidlearning/AidLearning-FrameWork/releases)

The app (apk) is only 6 MB. When you install the apk and launch it, it automatically downloads the Linux dependencies and the code examples, about 1 GB in total, so it is recommended that you install it _over wifi_.
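As a quick pre-install check, the CPU architecture requirement above can be verified from any Python prompt on the device. A minimal sketch using only the standard library (`aarch64`/`arm64` are the machine strings commonly reported on 64-bit ARM systems):

```
import platform

def is_arm64() -> bool:
    """Return True when running on an Arm64 (aarch64) CPU."""
    # platform.machine() reports e.g. 'aarch64' on 64-bit ARM Linux,
    # 'arm64' on some systems, and 'x86_64' on a desktop PC
    return platform.machine().lower() in ("aarch64", "arm64")

print(platform.machine(), "->", "supported" if is_arm64() else "unsupported")
```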
**Important reminder:** After entering the desktop, click the settings icon; a dialog box asking for the camera permission will pop up. Please click "agree" if you want to use the built-in examples.

**New release**: [Download v0.75 now!](http://www.aidlearning.net/downloads/aidlux-07-18.apk)

This is a smaller version: after you finish the install and launch the GUI, you can choose which AI frameworks to install for Python 2/3 (Caffe/MXNet/TensorFlow/PyTorch/Keras). This version downloads only about 650 MB of dependencies in total, so it saves disk space.

## Support

Supported AI frameworks:

* [Caffe](https://github.com/BVLC/caffe)
* [Tensorflow](https://github.com/tensorflow/tensorflow)
* [Mxnet](https://github.com/apache/incubator-mxnet)
* [Keras](https://github.com/keras-team/keras)
* [ncnn](https://github.com/Tencent/ncnn)
* [pytorch](https://github.com/pytorch/pytorch)
* [opencv](https://github.com/opencv/opencv)

Both Python 2.7 and Python 3.6.4 are supported:

| AidLearning | Python 2.7 | Python 3.6 |
| ----------- | ---------- | ---------- |
| caffe       | ✓ 1.0.0    | ✓ 1.0.0    |
| mxnet       | ✓ 1.0.0    | ✓ 1.5.0    |
| tensorflow  | ✓ 1.10.0   | ✓ 1.5.0    |
| Gluoncv     | ✗          | ✓ 0.40     |
| Keras       | ✓ 2.2.4    | ✓ 2.2.4    |
| Pytorch     | ✗          | ✓ 1.1.0    |
| Opencv(cv2) | ✓ 2.4.9    | ✓ 3.4.6    |
| Scipy       | ✓ 0.18.1   | ✓ 1.3.0    |
| Numpy       | ✓ 1.14.5   | ✓ 1.16.3   |

## Touch-and-Drag Programming

Now you can easily customize your GUI with touch and drag using the wizard! The wizard produces code like this automatically:

```
import remi.gui as gui          # GUI toolkit bundled with AidLearning
from remi import App, start

class MyApp(App):
    def __init__(self, *args):
        super(MyApp, self).__init__(*args)

    def main(self):
        container = gui.VBox(width=120, height=100)
        self.lbl = gui.Label('Hello world!')
        self.bt = gui.Button('Press me!')
        # setting the listener for the onclick event of the button
        self.bt.onclick.do(self.on_button_pressed)
        # appending the widgets to the container
        container.append(self.lbl)
        container.append(self.bt)
        # returning the root widget
        return container

    # listener function
    def on_button_pressed(self, widget):
        self.lbl.set_text('Button pressed!')
        self.bt.set_text('Hi!')

# starts a local web server and opens the GUI in the browser
start(MyApp)
```
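The framework versions listed in the support table above can be checked from a Python prompt inside AidLearning. A minimal sketch (the module names used here are the usual import names, e.g. `cv2` for OpenCV and `torch` for PyTorch; frameworks that were not selected during install simply report as missing):

```
import importlib

def framework_versions(names):
    """Map each module name to its version string (None if not installed)."""
    versions = {}
    for name in names:
        try:
            module = importlib.import_module(name)
            versions[name] = getattr(module, "__version__", "unknown")
        except ImportError:
            versions[name] = None  # framework not installed
    return versions

for name, ver in framework_versions(["numpy", "scipy", "cv2", "torch"]).items():
    print(name, ver or "not installed")
```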