The brain is often compared to a neural network. This is the analogy most commonly used to help a newcomer understand the ideas behind machine learning and artificial neural networks.
Because a great deal of mathematics and statistics runs behind the scenes, defining these networks as mathematical functions is the more rigorous approach.
This article is for people who are interested in machine learning and want to see how a neural network is written in Python code.
In this article, we will show how to build a deep neural network (DNN) from scratch in Python 3.
File Structure Overview for Our Python Neural Network Code
Three files will be created here. The first is the simple nn.py file, which is discussed from "Importing Modules and Setting Up Helper Functions" through "The Complete Code."
We will also have a file named mnist_loader.py for loading the test data, as described in "Loading the MNIST Data."
Finally, we will have a file named test.py, which will be run in the terminal to test our network.
That file is described in detail in "Running the Test."
Installation
The NumPy Python library must be installed to follow this tutorial. You can do so with the following command in the terminal:
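A typical way to install it (assuming a working Python installation with pip available) is:

```shell
# Install NumPy, the only third-party dependency for this tutorial
python -m pip install numpy
```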
Importing Modules and Setting Up Helper Functions
The only two libraries we need are random and NumPy, which we will import right away. The random library will be used to shuffle the training data during each epoch.
To speed up our computations, we will use NumPy (by convention, it is usually imported as np). Our two helper functions are defined right after the imports: the sigmoid function and its derivative, sigmoid prime.
Feedforward will classify data using the sigmoid function, while backpropagation will compute the delta, or gradient, using the sigmoid prime function.
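As a minimal sketch, these two helpers might look like the following (only the names sigmoid and sigmoid_prime come from the text):

```python
import numpy as np

def sigmoid(z):
    # Squash any real-valued input into the (0, 1) range.
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, used for the gradients in backpropagation.
    return sigmoid(z) * (1.0 - sigmoid(z))
```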
Creating the Network Class
Building the complete neural network class is the sole focus of this section. The Network class will contain all of the functions that follow. The constructor is defined at the top of our network class.
One argument, sizes, is required by the constructor. The sizes argument is a list of integers representing the number of nodes in each layer of our network.
We initialize four attributes in our __init__ method. The input variable, sizes, is used to set the list of layer sizes and the number of layers, num_layers, respectively.
The first step is to assign our network's initial biases, one set for each layer after the input layer.
Finally, every connection between the input and output layers is given a random weight. np.random.randn() returns random samples drawn from the standard normal distribution.
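Putting the steps above together, the constructor might be sketched like this (a minimal version; sizes, num_layers, biases, and weights follow the description, everything else is illustrative):

```python
import numpy as np

class Network:
    def __init__(self, sizes):
        # sizes, e.g. [784, 30, 10]: the number of nodes in each layer.
        self.sizes = sizes
        self.num_layers = len(sizes)
        # One bias column vector for every layer after the input layer.
        self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
        # One weight matrix for every pair of adjacent layers.
        self.weights = [np.random.randn(y, x)
                        for x, y in zip(sizes[:-1], sizes[1:])]
```

Note that np.random.randn(y, x) yields a (y, x) matrix, so each weight matrix maps a layer of x nodes to the next layer of y nodes.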
The Feedforward Function
In a neural network, data is passed forward by the feedforward function. One argument, a, representing the current activation, is required by this function.
This function computes the activations at each layer by applying all of the network's biases and weights in turn. The value it returns is the prediction, which is the activation of the final layer.
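As a standalone sketch (written as a plain function over the weight and bias lists rather than as the class method, so it can be shown in isolation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(a, weights, biases):
    # Pass the activation a through each layer: a' = sigmoid(w·a + b).
    for w, b in zip(weights, biases):
        a = sigmoid(np.dot(w, a) + b)
    # The final activation is the network's prediction.
    return a
```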
Mini-batch Gradient Descent
The heart of our network class is gradient descent. In this version, we use mini-batch (stochastic) gradient descent, a modified variant of plain gradient descent.
This means that a small batch of the data is used to update our model at each step. Four required arguments and one optional argument are passed to this method. The four required arguments are the training dataset, the number of epochs, the mini-batch size, and the learning rate (eta).
The test data is optional. We will supply test data when we evaluate this network. The number of samples in this function is first set to the length of the list, once the training data has been converted to a list.
We apply the same conversion to the test data, if it is provided. This is because, rather than being handed to us as lists, the datasets are actually zips of lists. We will learn more about this when we load the MNIST data samples later.
If we could guarantee that both datasets were supplied as lists, this type cast would not be necessary.
Once we have the data, we loop over the training epochs. A training epoch is just one round of training the neural network. We first shuffle the data in each epoch to ensure randomness before building the list of mini-batches.
The mini-batch update function, discussed below, is called for each mini-batch. The test accuracy is also reported if test data was provided.
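The epoch/mini-batch loop described above can be sketched as follows (a simplified standalone version: update_fn stands in for the mini-batch update method, and the optional test-data evaluation is omitted):

```python
import random

def sgd(training_data, epochs, mini_batch_size, eta, update_fn):
    # Cast the (possibly zipped) data to a list so it can be shuffled.
    training_data = list(training_data)
    n = len(training_data)
    for epoch in range(epochs):
        # Shuffle each epoch, then slice the data into mini-batches.
        random.shuffle(training_data)
        mini_batches = [training_data[k:k + mini_batch_size]
                        for k in range(0, n, mini_batch_size)]
        for mini_batch in mini_batches:
            update_fn(mini_batch, eta)
```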
The Cost Derivative Helper Function
Let's first build a helper function called cost_derivative before we write the actual backpropagation code. If our output is wrong, the cost derivative function will reflect it.
It requires two inputs: the array of output activations and the y-values of the expected output.
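For the quadratic cost used in networks like this one, the derivative with respect to the output activations is simply their difference from the targets; a minimal sketch:

```python
import numpy as np

def cost_derivative(output_activations, y):
    # Gradient of the quadratic cost with respect to the output activations.
    return output_activations - y
```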
The Backpropagation Function
A list of every activation, activations, and a list of every z-vector, zs, must be maintained. The input layer's activation is the first entry.
We then loop over every bias and weight after initializing the nablas to zero. Each iteration consists of computing the z vector as the dot product of the weights and the activation plus the bias, appending it to the zs list, recomputing the activation, and appending the updated activation to the activations list.
Finally, the delta, which equals the output-layer error multiplied by the sigmoid prime of the last element of the zs list, is computed before we begin our backward pass.
The last layer of nabla_b is set to delta, and the last layer of nabla_w is set to the dot product of delta and the second-to-last activation (transposed so the multiplication works out).
We continue in the same way, moving backward from the second-to-last layer toward the first and repeating these steps for each layer. The nablas are returned as a tuple.
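The whole procedure can be sketched as a standalone function (taking the weight and bias lists explicitly instead of reading them off self; the helper names follow the earlier sections):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    return sigmoid(z) * (1.0 - sigmoid(z))

def backprop(x, y, weights, biases):
    # Gradients start as zero arrays shaped like the parameters.
    nabla_b = [np.zeros(b.shape) for b in biases]
    nabla_w = [np.zeros(w.shape) for w in weights]
    # Forward pass: record every activation and every z-vector.
    activation = x
    activations = [x]
    zs = []
    for w, b in zip(weights, biases):
        z = np.dot(w, activation) + b
        zs.append(z)
        activation = sigmoid(z)
        activations.append(activation)
    # Backward pass: output-layer error first.
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
    nabla_b[-1] = delta
    nabla_w[-1] = np.dot(delta, activations[-2].T)
    # Walk back through the remaining layers.
    num_layers = len(weights) + 1
    for l in range(2, num_layers):
        z = zs[-l]
        delta = np.dot(weights[-l + 1].T, delta) * sigmoid_prime(z)
        nabla_b[-l] = delta
        nabla_w[-l] = np.dot(delta, activations[-l - 1].T)
    return nabla_b, nabla_w
```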
Updating a Single Mini-batch
Our SGD (stochastic gradient descent) method above hands off to the mini-batch update. Since it is used by SGD but also requires backprop, I debated where to place this function.
In the end, I decided to present it here. It starts by creating zero vectors for the bias and weight nablas, just as our backprop function did.
It requires a mini-batch and the learning rate eta as its two inputs. For each input, x, and output, y, in the mini-batch, we call the backprop function to get the delta for each nabla array. The nabla lists are accumulated with these deltas.
Finally, we use the learning rate and the nablas to update the network's weights and biases. Each parameter is set to its previous value minus the learning rate, divided by the mini-batch size, times the corresponding nabla value.
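A sketch of that update, again as a standalone function (backprop_fn stands in for the backprop method, and the new parameters are returned rather than assigned to self):

```python
import numpy as np

def update_mini_batch(mini_batch, eta, weights, biases, backprop_fn):
    # Accumulate gradients over every example in the mini-batch.
    nabla_b = [np.zeros(b.shape) for b in biases]
    nabla_w = [np.zeros(w.shape) for w in weights]
    for x, y in mini_batch:
        delta_nabla_b, delta_nabla_w = backprop_fn(x, y)
        nabla_b = [nb + dnb for nb, dnb in zip(nabla_b, delta_nabla_b)]
        nabla_w = [nw + dnw for nw, dnw in zip(nabla_w, delta_nabla_w)]
    # Step each parameter against its gradient, scaled by eta / batch size.
    m = len(mini_batch)
    weights = [w - (eta / m) * nw for w, nw in zip(weights, nabla_w)]
    biases = [b - (eta / m) * nb for b, nb in zip(biases, nabla_b)]
    return weights, biases
```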
Measuring Performance
The evaluate function is the last one we need to write. The test data is the only input to this function. In it, we simply compare the network's outputs with the expected results, y. The network's outputs are obtained by feeding each input, x, forward.
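In sketch form (feedforward_fn stands in for the network's feedforward method, and the label y is assumed to be a plain digit, as in the MNIST test set):

```python
import numpy as np

def evaluate(test_data, feedforward_fn):
    # The predicted digit is the index of the highest output activation.
    results = [(np.argmax(feedforward_fn(x)), y) for x, y in test_data]
    # Count how many predictions match the expected label.
    return sum(int(pred == y) for pred, y in results)
```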
The Complete Code
When we put all of the code together, this is how it looks.
Testing the Neural Network
Loading the MNIST Data
The MNIST dataset comes in .pkl.gz format, which we will open with gzip and load with pickle. Let's write a quick method to load this data as a tuple of size three: the training, validation, and test data.
To make our data easier to process, we will write a function to one-hot encode y into a list of 10 elements. The entries will all be 0s, except for a 1 at the index corresponding to the image's correct digit.
We will use the base data loader and the one-hot encoding method to load our data into a usable form. We will write a function that converts our x-values into lists of size 784, corresponding to the 784 pixels of each image, and our y-values into one-hot vectors.
Then we will zip the x- and y-values together so that each index of one matches the corresponding index of the other. This applies to the training, validation, and test datasets. Finally, we return the transformed data.
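A sketch of mnist_loader.py along those lines (the archive filename mnist.pkl.gz is an assumption, and the pickled tuple layout follows the common MNIST pickle distribution described above):

```python
import gzip
import pickle
import numpy as np

def one_hot(j):
    # 10-element column vector: all zeros except a 1 at digit j.
    e = np.zeros((10, 1))
    e[j] = 1.0
    return e

def load_data(path="mnist.pkl.gz"):
    # The archive holds a (training, validation, test) tuple.
    with gzip.open(path, "rb") as f:
        return pickle.load(f, encoding="latin1")

def load_data_wrapper(path="mnist.pkl.gz"):
    tr_d, va_d, te_d = load_data(path)
    # Reshape each image into a 784x1 column; one-hot the training labels.
    training_inputs = [np.reshape(x, (784, 1)) for x in tr_d[0]]
    training_results = [one_hot(y) for y in tr_d[1]]
    training_data = zip(training_inputs, training_results)
    validation_inputs = [np.reshape(x, (784, 1)) for x in va_d[0]]
    validation_data = zip(validation_inputs, va_d[1])
    test_inputs = [np.reshape(x, (784, 1)) for x in te_d[0]]
    test_data = zip(test_inputs, te_d[1])
    return training_data, validation_data, test_data
```

Note that the wrapper returns zips of lists, which is why the SGD method casts its inputs to lists before use.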
Running the Test
We will make a new file named test.py that imports both the neural network we built earlier (nn.py) and the MNIST data loader (mnist_loader.py) before we begin testing.
In this file, all we need to do is load the data, build a network with an input size of 784 and an output size of 10, run the network's SGD function on the training data, and then evaluate it with the test data.
Remember that, for our list of layer sizes, it does not matter what numbers sit between 784 and 10. We can change the other layers however we like; only the input and output sizes are fixed.
Three layers are not required, either; we could use four, five, or even just two. Have fun experimenting with it.
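A hypothetical test.py might look like this (the module names, the hidden-layer size of 30, and the hyperparameters are all illustrative; the import guard exists only so the sketch stands alone):

```python
# Hypothetical test.py, assuming nn.py and mnist_loader.py from the
# sections above sit in the same directory (module names are illustrative).
try:
    import mnist_loader
    import nn
except ImportError:  # files not present in this environment
    mnist_loader = nn = None

def run():
    # Load the three MNIST splits; only training and test are used here.
    training_data, validation_data, test_data = mnist_loader.load_data_wrapper()
    # 784 input pixels, one hidden layer (30 is an arbitrary choice),
    # 10 output digits.
    net = nn.Network([784, 30, 10])
    # Epochs, mini-batch size, and eta are illustrative hyperparameters.
    net.SGD(training_data, 30, 10, 3.0, test_data=test_data)

if __name__ == "__main__" and nn is not None:
    run()
```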
Conclusion
Here, using Python 3, we created a neural network from scratch. Alongside the high-level math, we also discussed the implementation details.
We started by implementing the helper functions. For the neurons to work, the sigmoid and sigmoid prime functions are essential. Then we implemented the feedforward function, the basic process for feeding data through the neural network.
After that, we built the gradient descent routine in Python, the engine that drives our network. To find a local minimum and optimize its weights and biases, our neural network relies on gradient descent. On top of it, we implemented the backpropagation function.
By propagating corrections backward when the outputs do not match their expected labels, this function allows the neural network to "learn."
Finally, we put our new Python neural network to the test with the MNIST dataset. Everything works fine.
Happy coding!