EntGAN: A Distributed GAN Framework Based on Multi-task #337
Changes from 8 commits
```diff
@@ -28,3 +28,6 @@ __pycache__/
+# go build output
+/_output
+
+# macOS
+*/.DS_Store
```
# Integrate GAN and Self-taught Learning into Sedna Lifelong Learning to Handle Unknown Tasks

## Motivation
In the process of Sedna lifelong learning, unknown tasks may be encountered, and their data are typically heterogeneous small samples. Generative Adversarial Networks (GANs) are state-of-the-art generative models that can synthesize data following the distribution of the real data, so we use a GAN to address the small-sample problem. Self-taught learning is an approach that improves classification performance by using sparse coding on unlabeled data to construct higher-level features. We therefore combine GAN and self-taught learning to help Sedna lifelong learning handle unknown tasks.
### Goals

> **Collaborator:** See the previous discussion in #337 (comment). The story is not yet completed.
>
> **Collaborator:** The current version does not express the limitation of the current lifelong learning.
* Handle unknown tasks
* Implement a lightweight GAN to mitigate the small-sample problem
* Utilize self-taught learning to address data heterogeneity

## Proposal

We focus on the process of handling unknown tasks.

The overview is as follows:

![Overview](images/EntGAN%20overview.png)

The process is illustrated below:

1. The GAN exploits the unknown-task samples to generate additional fake samples.
2. The self-taught learning unit uses the fake samples together with the original unknown-task samples and their labels to train a classifier.
3. A well-trained classifier is produced as output.
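The three steps above can be sketched end to end with toy stand-ins. The code below is only a data-flow illustration under stated assumptions: a per-class Gaussian sampler stands in for the trained GAN, and a nearest-centroid classifier stands in for the self-taught learning unit; neither is the actual Sedna implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1 (GAN stand-in): fit a Gaussian to the few real unknown-task
# samples and draw synthetic samples from it. A real deployment trains a
# GAN; the Gaussian sampler only illustrates the augmentation step.
def augment(samples, n_fake):
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
    return rng.multivariate_normal(mu, cov, size=n_fake)

# A few labeled samples from two hypothetical unknown-task classes.
real_a = rng.normal(loc=0.0, scale=0.5, size=(5, 2))
real_b = rng.normal(loc=3.0, scale=0.5, size=(5, 2))

# Step 2: merge real and synthetic samples into one training set.
X = np.vstack([real_a, augment(real_a, 50), real_b, augment(real_b, 50)])
y = np.array([0] * 55 + [1] * 55)

# Step 3 (classifier stand-in): nearest-centroid classifier trained on
# the augmented set.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(points):
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

print(predict(np.array([[0.1, -0.2], [2.9, 3.1]])))  # → [0 1]
```

The point of the sketch is the interface between the steps: the augmentation component only needs raw samples, and the classifier component only needs the merged (real + fake) labeled set.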

> **Collaborator:** What are the targeting scenario and dataset?
### GAN Design

We adopt the network design from [Towards Faster and Stabilized GAN Training for High-Fidelity Few-Shot Image Synthesis](https://openreview.net/forum?id=1Fqg133qRaI). The design targets small training sets and resource-poor computing devices, which makes it well suited to handling unknown tasks in Sedna lifelong learning. The network is shown in [GAN Design](images/EntGAN%20GAN.png).

![GAN Design](images/EntGAN%20GAN.png)

> **Collaborator:** The architecture is needed for the proposal. We see that the GAN is now put in the unseen task processing. It would be better to show the overall architecture to let the user know which scheme it belongs to (i.e., lifelong learning), not only the unseen task processing component. See previous comment: #337 (comment)
>
> **Collaborator:** Not yet resolved.

### Self-taught Learning Design

Self-taught learning uses unlabeled data to find latent features of the data, represents every labeled sample in terms of those latent features, and then trains a classifier on the representations and their corresponding labels.

![Self-taught Learning](images/EntGAN%20stl.png)