NYLXS Mailing Lists and Archives
NYLXS members have a lot to say and share, and we don't keep many secrets. Join the Hangout mailing list and say your piece.

DATE 2017-03-01

LEARN


MESSAGE
DATE 2017-03-29
FROM Christopher League
SUBJECT Re: [Learn] This is hard to understand what the logic is here
From: Christopher League <league-at-contrapunctus.net>
To: Ruben Safir <ruben-at-mrbrklyn.com>, learn-at-nylxs.com
In-Reply-To: <631dd24c-6613-b87b-7e47-5dd853e21f24-at-mrbrklyn.com>
References:
<430c32f2-da89-4077-b8a4-a5622fbb6b12.maildroid-at-localhost>
<631dd24c-6613-b87b-7e47-5dd853e21f24-at-mrbrklyn.com>
User-Agent: Notmuch/0.22 (http://notmuchmail.org) Emacs/25.1.1
(x86_64-unknown-linux-gnu)
Date: Wed, 29 Mar 2017 13:10:52 -0400
Message-ID: <87h92cqa8z.fsf-at-contrapunctus.net>
MIME-Version: 1.0
Subject: Re: [Learn] This is hard to understand what the logic is here

Ruben Safir writes:

> If there is always a bias, why is there a while condition, let alone not
> an if condition. What situation were you anticipating?

These logic-gate perceptrons have two inputs, so you should be able to
declare:

vector<float> inputs = {0, 1};

But the weight vector is three values:

vector<float> weights = {0.1, -0.2, 0.3};

The last one is what we call the bias. You can think of it as the weight
attached to a "pseudo" input which is always one. So what the loop does
is to make that "pseudo" input real:

while (input.size() < p.weights.size()) {
    input.push_back(1);  // Add pseudo input that's always 1
}

Now, since the input.size() matches the weights.size(), we can call the
dot product to do the calculation:

dot_product(p.weights, input)

It's just the "robustness principle": be strict in what you send and
tolerant in what you accept. In the feed_forward function I'm receiving
the input vector. I'm sending it to the dot_product function. I'm
tolerant if the input is too small, and I augment it with ones (pseudo
inputs) until there are enough inputs that the dot_product will work.
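
In case it helps to see the whole flow in one place, here is a minimal, self-contained sketch. The Perceptron struct, the dot_product helper, and the step activation are assumptions I'm filling in for illustration; they may not match the exact code you're working from:

#include <cstddef>
#include <iostream>
#include <vector>
using std::vector;

// Assumed shape of the perceptron: the weights vector carries the bias
// weight as its last entry.
struct Perceptron {
    vector<float> weights;
};

// Plain dot product; the two vectors must already be the same length.
float dot_product(const vector<float>& w, const vector<float>& x) {
    float sum = 0;
    for (std::size_t i = 0; i < w.size(); ++i)
        sum += w[i] * x[i];
    return sum;
}

// Pad the input with pseudo inputs of 1 until it matches the weights,
// then threshold the weighted sum (a step activation, assumed here).
int feed_forward(const Perceptron& p, vector<float>& input) {
    while (input.size() < p.weights.size())
        input.push_back(1);              // pseudo input for the bias
    return dot_product(p.weights, input) > 0 ? 1 : 0;
}

int main() {
    Perceptron p{{0.1f, -0.2f, 0.3f}};
    vector<float> inputs = {0, 1};       // only the two real inputs
    std::cout << feed_forward(p, inputs) << "\n";  // pads to {0, 1, 1}, prints 1
}

The point of the arrangement is that the caller only has to supply the real inputs; the bias bookkeeping stays inside feed_forward.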

BTW, it only has to append the first time through the feed_forward.
Thereafter, if I continue to use the same input vector then the size
will already be correct and the loop body is never executed.

Or if you just want to declare the inputs to have three values, then it
just amounts to a check that you did that, and doesn't have to append:

vector<float> inputs = {0, 1, 1}; // The last 1 is the pseudo input for bias
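
Either way the numbers come out the same: with the example weights above, the net input is 0*0.1 + 1*(-0.2) + 1*0.3 = 0.1, whether the trailing 1 was declared up front or appended by the loop.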

CL



_______________________________________________
Learn mailing list
Learn-at-nylxs.com
http://lists.mrbrklyn.com/mailman/listinfo/learn


  1. 2017-03-02 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] neutal networks and pacman
  2. 2017-03-02 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Ultrametric networks: a new tool for phylogenetic analysis
  3. 2017-03-05 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] cost in evolution
  4. 2017-03-09 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] CatScan fossil files
  5. 2017-03-10 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] neural Networks and Quantum Mechanics
  6. 2017-03-12 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] I just found a GREAT video on partial derivatives
  7. 2017-03-13 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] Contact DOJ and thell them to blow it out their ass
  8. 2017-03-14 Ruben Safir <ruben-at-mrbrklyn.com> Re: [Learn] CatScan fossil files
  9. 2017-03-14 Ramon Nagesan <ramon.nagesan-at-gmail.com> Re: [Learn] CatScan fossil files
  10. 2017-03-16 Ruben Safir <ruben-at-mrbrklyn.com> Re: [Learn] is this up
  11. 2017-03-16 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] hang out is down
  12. 2017-03-16 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] is this working
  13. 2017-03-16 Charlie Gonzalez <itcharlie-at-gmail.com> Subject: [Learn] Registration for The Perl Conference 2017 is now open!!
  14. 2017-03-16 From: "soledad.esteban" <soledad.esteban-at-icp.cat> Subject: [Learn] [dinosaur] Advanced course Geometric Morphometrics in R,
  15. 2017-03-17 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Fwd: [dinosaur] Advanced course Geometric Morphometrics in
  16. 2017-03-17 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] good news !
  17. 2017-03-18 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] circles
  18. 2017-03-20 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Alice
  19. 2017-03-20 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] hough transform - Lecture 9
  20. 2017-03-21 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Anyone understand this well
  21. 2017-03-21 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Fwd: [dinosaur] Digital mapping of dinosaurian tracksites
  22. 2017-03-22 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] ODBASE 2017 - The 16th International Conference on
  23. 2017-03-24 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Decision Tree
  24. 2017-03-24 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Genetic Modification with decent
  25. 2017-03-27 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] Fwd: Re: hough transform - Lecture 9
  26. 2017-03-27 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] MOOCS
  27. 2017-03-27 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Peter Novig learning and on line teaching
  28. 2017-03-28 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] computations
  29. 2017-03-29 Christopher League <league-at-contrapunctus.net> Re: [Learn] This is hard to understand what the logic is here
  30. 2017-03-29 Ruben Safir <ruben-at-mrbrklyn.com> Re: [Learn] This is hard to understand what the logic is here
  31. 2017-03-29 Christopher League <league-at-contrapunctus.net> Re: [Learn] This is hard to understand what the logic is here
  32. 2017-03-29 Ruben Safir <mrbrklyn-at-panix.com> Re: [Learn] This is hard to understand what the logic is here
  33. 2017-03-29 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] perseptors
  34. 2017-03-29 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] This is hard to understand what the logic is here
  35. 2017-03-29 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] This is hard to understand what the logic is here
  36. 2017-03-30 Ruben Safir <mrbrklyn-at-panix.com> Re: [Learn] c arrays
  37. 2017-03-30 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] c arrays
  38. 2017-03-30 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] random weights
  39. 2017-03-30 Ruben Safir <mrbrklyn-at-panix.com> Subject: [Learn] randomize with commentary
  40. 2017-03-31 Ruben Safir <ruben-at-mrbrklyn.com> Subject: [Learn] Computational Paleo

NYLXS members are doers, and the first step of doing is joining! Join NYLXS and make a difference in your community today!