    M5Stack Community

    V-Training Service is active now??

    M5Stick V · v-training
    12 Posts · 7 Posters · 24.7k Views
    Nabeshin

      Thank you.
      It's not a serious problem at all.
      I will try it again next year.

      Have a nice holiday!

      Nabeshin

        @Zontex
        Now I can upload files, but training always fails with this error:

        CONTENT: Unexpected Error Found During Triaining, A target array with shape (128, 6) was passed for an output of shape (None, 7) while using as loss categorical_crossentropy. This loss expects targets to have the same shape as the output.

        I uploaded the following data:

        • 6 classes of training data
        • Each class has 90 files (85 files in the train folder, 5 + 1 files in each valid folder)
          • At first, using the result folder created by UnitV_boot_v1220.py, I got Error: Lake of Enough Valid Dataset, Only 30 pictures found, but you need 35 in total.
          • I think "Lake of" is a typo for "Lack of".
          • So I manually moved 1 file from the train folder to the valid folder for each class, making 5 + 1 files in each valid folder (36 files total in the valid folders).
          • After that, this step passes.
        • But now I get that error about categorical_crossentropy.
          • I tried several cases and got the same kind of error:
          • For the 6-class case: A target array with shape (128, 6) was passed for an output of shape (None, 7)
          • For the 4-class case: A target array with shape (64, 4) was passed for an output of shape (None, 5)
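The manual rebalancing step above (moving one image per class from train to valid) can be scripted instead of done by hand. This is only a sketch; the `train`/`valid` folder layout and the function name `move_one_to_valid` are assumptions inferred from the post, not part of the V-Training tooling:

```python
import shutil
from pathlib import Path

def move_one_to_valid(dataset_root: str) -> None:
    """Move one file per class from train/<class>/ to valid/<class>/.

    Assumed (hypothetical) layout, inferred from the post:
        dataset_root/train/<class_name>/*.jpg
        dataset_root/valid/<class_name>/*.jpg
    """
    root = Path(dataset_root)
    for class_dir in sorted((root / "train").iterdir()):
        if not class_dir.is_dir():
            continue
        valid_dir = root / "valid" / class_dir.name
        valid_dir.mkdir(parents=True, exist_ok=True)
        # Move the first file found in this class's train folder.
        first_file = sorted(p for p in class_dir.iterdir() if p.is_file())[0]
        shutil.move(str(first_file), str(valid_dir / first_file.name))
```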

        I'm not sure, but I think it's because:

        • In the model.add(Dense(X)) call, X should equal the number of class folders uploaded in each request.
        • But maybe your script counts one more class than actually exists.
          • https://github.com/keras-team/keras/issues/3237

        Do you have any idea how to fix it?
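The suspected off-by-one can be illustrated without the V-Training backend. A minimal sketch, where `categorical_crossentropy` is a simplified stand-in for Keras's loss (only the shape check and the one-hot cross-entropy formula) and `num_classes`, `bad_outputs`, `good_outputs` are hypothetical names:

```python
import numpy as np

def categorical_crossentropy(targets: np.ndarray, outputs: np.ndarray) -> float:
    """Simplified stand-in for Keras's loss: targets and outputs must match."""
    if targets.shape[-1] != outputs.shape[-1]:
        raise ValueError(
            f"A target array with shape {targets.shape} was passed for an "
            f"output of shape (None, {outputs.shape[-1]})"
        )
    return float(-np.mean(np.sum(targets * np.log(outputs + 1e-7), axis=-1)))

num_classes = 6
batch = 128

# One-hot labels for 6 classes: shape (128, 6).
targets = np.eye(num_classes)[np.random.randint(0, num_classes, size=batch)]

# Bug scenario: the final Dense layer built with num_classes + 1 units,
# so the softmax output has shape (128, 7) and the loss rejects it.
bad_outputs = np.full((batch, num_classes + 1), 1.0 / (num_classes + 1))
# categorical_crossentropy(targets, bad_outputs)  # raises ValueError

# Fix: Dense(num_classes) gives outputs of shape (128, 6), matching the labels.
good_outputs = np.full((batch, num_classes), 1.0 / num_classes)
loss = categorical_crossentropy(targets, good_outputs)
```

With uniform outputs the loss reduces to -log(1/6), which is what a freshly initialized 6-class classifier should roughly report.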

        Regards.

        Nabeshin

          @Zontex
          Hi, I tried again just now, but it seems there are still problems...

          Can I build the same environment on my own computer?
          What should I install for this machine learning?

          If you can point me to the GitHub URL, I'd be glad to help.

          Regards.

          Nabeshin

            Hi, I have now found a solution by myself.
            We can build a private K210 machine-learning environment on Google Colaboratory.
            Of course, we can share it.

            I wrote an article here.
            This can be used as an alternative to V-Training for the M5StickV and M5UnitV.

            Qiita

            • https://qiita.com/Nabeshin/items/d526a94b8f3be85bbb05

            Google Colaboratory

            • https://colab.research.google.com/drive/1UdQV5nQMQy1ckJP0qdS3kS7sW6sEQyzs#scrollTo=prOt_VofnYCo

            Thank you.

            yossy

              @Zontex
              Hi, I am now facing the same issue.

              My task (No. 10: 2d37a16258bdaee3) does not finish... and No. 1, which was uploaded on 5/7/2021, does not seem to have finished yet either. Is it possible to know the current situation? Thank you in advance.

              m5stack

                @yossy Sorry about that... it seems the server has some problems. We are working on it now.

                yossy

                  @m5stack Thank you for the notice. I have confirmed that the system is working correctly now. Thank you!

                  kenkenissocool

                    @m5stack
                    Hi, I am now facing the same issue.

                    My task (No. 2: 288ab05e8953e901) does not finish... and No. 1, which was uploaded on 3/12/2022, does not seem to have finished yet either. Is it possible to know the current situation? Thank you in advance.

                    class14

                      Do I have the same problem if mine has said "waiting" since yesterday?

                      Harvestman

                        I'm also waiting on this. It looks like there are items that have been in the queue since 12/2/2022.

