10:03:21 From James Newton to Everyone: The thing about traveling by air is the other passengers. They can really make life hell.
10:05:57 From Steven Kaehler to Everyone: Yah. Those other passengers sure make things crowded!
10:09:05 From John Jennings to Everyone: RoboKame’s 2nd run of Popcan Challenge https://www.youtube.com/watch?v=FRkc9BOGA3o
This YT video is the complete recording of the Popcan Challenge, and I added a comment with timestamps for each run (I think) and the wrong name.
10:09:30 From James Newton to Everyone: LOL… sorry, JJ, posted before I saw your link.
10:10:59 From John Jennings to Everyone: Oops, RoboKame’s 2nd run of Popcan Challenge https://youtu.be/UAPeGdxWZig
10:11:40 From John Jennings to Everyone:
Facebook Group for Popcan Challenge https://www.facebook.com/groups/PopCanChallenge
10:14:26 From James Newton to Everyone:
http://techref.massmind.org/techref/method/ai/LinearRegresion.htm
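The linear regression James links boils down to a two-parameter least-squares fit. A minimal closed-form sketch (no libraries; the data points are made-up toy values):

```python
# Ordinary least-squares fit of y = m*x + b via the closed-form
# normal equations; no libraries needed.
def linreg(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

m, b = linreg([0, 1, 2, 3], [1, 3, 5, 7])  # points on the line y = 2x + 1
print(m, b)  # 2.0 1.0
```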
10:26:56 From James Newton to Everyone: I’d love to hear “lessons learned” from the Pop Can Challenge competitors.
10:28:26 From John Jennings to Everyone: A really handy tool for Pop Can Challengers: the WiFi Analyzer app for Android
https://play.google.com/store/apps/details?id=abdelrahman.wifianalyzerpro&hl=en_US&pli=1
10:40:30 From Nathan Kohagen to Everyone: https://en.wikipedia.org/wiki/Smith_predictor
10:41:46 From James Newton to Everyone: Thanks for that link Nathan!
10:43:49 From Nathan Kohagen to Everyone: I saw it used in a proprietary control loop; I can’t share the full context for work reasons.
10:44:39 From Nathan Kohagen to Everyone: It’s used to compensate for delay in control loops.
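The Smith predictor idea can be sketched in a few lines: control an undelayed internal model of the plant, then correct with the mismatch between the real (delayed) measurement and a delayed copy of the model. This is a toy discrete-time simulation; the first-order plant, delay length, and gains are all made-up numbers, and the internal model is assumed to match the plant exactly:

```python
from collections import deque

# Toy Smith predictor: plant x[k+1] = a*x[k] + b*u[k] behind a pure
# D-sample transport delay, plain proportional control (all values assumed).
a, b, D, Kp, sp = 0.9, 0.1, 5, 2.0, 1.0

x = ym = 0.0               # plant state / undelayed internal model
y = ymd = 0.0              # delayed plant output / delayed model output
buf = deque([0.0] * D)     # transport delay on the real plant
bufm = deque([0.0] * D)    # same delay applied to the model

for _ in range(200):
    # feed back the undelayed model, corrected by plant/model mismatch
    y_pred = ym + (y - ymd)
    u = Kp * (sp - y_pred)

    x = a * x + b * u                     # real plant dynamics
    buf.append(x); y = buf.popleft()      # measurement arrives D steps late
    ym = a * ym + b * u                   # internal model, no delay
    bufm.append(ym); ymd = bufm.popleft()

print(round(y, 3))  # P control leaves offset: settles at b*Kp/(1 - a + b*Kp) = 0.667
```

With a perfect model the mismatch term is zero and the controller effectively sees the plant without its delay, which is the whole point of the structure.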
10:46:25 From Nathan Kohagen to Everyone:
Kalman Filters can be used to compensate for erroneous data coming back from sensors. The different types of Kalman Filters dynamically switch back and forth between the predictive model of your system and the actual sensor data. They are used a lot for sensor fusion in IMUs / 9-DOF sensors.
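Nathan’s description can be illustrated with a minimal 1-D Kalman filter. The noise variances and measurements below are made-up toy values; note the filter doesn’t literally switch between model and sensor, it continuously re-weights them via the gain:

```python
# Minimal 1-D Kalman filter for a scalar random-walk model
# (q = process noise, r = measurement noise; toy values assumed).
def kalman_1d(zs, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in zs:
        p += q                  # predict: process noise grows uncertainty
        k = p / (p + r)         # Kalman gain: how much to trust the data
        x += k * (z - x)        # update: pull estimate toward measurement
        p *= 1 - k              # update shrinks uncertainty
        estimates.append(x)
    return estimates

# noisy readings of a true value of 10
est = kalman_1d([9.8, 10.3, 9.9, 10.1, 10.2, 9.95], x0=9.8)
```

For IMU-style sensor fusion the same predict/update cycle runs on a state vector (orientation, rates) instead of a scalar, usually as an Extended Kalman Filter.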
10:47:42 From James Newton to Everyone: KF’s are great.
10:47:59 From Chas Ihler to Everyone: Reacted to “Kalman Filters can b…” with ❤️
10:48:32 From James Newton to Everyone: Reacted to “Kalman Filters can…” with ❤️
10:48:40 From James Newton to Everyone: Reacted to “https://en.wikiped…” with ❤️
10:50:29 From Nathan Kohagen to Everyone: You’re describing TDD
https://en.wikipedia.org/wiki/Test-driven_development
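For anyone unfamiliar with the linked workflow, TDD in miniature (the `slugify` helper is purely hypothetical, not from the talk): the test is written first and fails, then just enough implementation is added to make it pass:

```python
# Step 1: write the test first (it fails, since slugify doesn't exist yet).
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Pop Can  Challenge ") == "pop-can-challenge"

# Step 2: minimal implementation written to satisfy the test above.
def slugify(text):
    return "-".join(text.lower().split())

test_slugify()  # passes silently
```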
10:50:32 From Terry James to Everyone:
Replying to “KF’s are great.” KF? (Kalman Filters).
10:50:51 From Chas Ihler to Everyone: I have to drop, have a great weekend all!
10:51:49 From Terry James to Everyone: Replying to “I have to drop, have…” CYA
10:52:33 From Nathan Kohagen to Everyone: KF:= Kalman Filter, EKF:= Extended Kalman Filter
10:52:36 From James Newton to Everyone: Replying to “KF’s are great.”
Kalman Filters
10:54:04 From James Newton to Everyone: KFC Kalman Filter[ed] Chicken
10:54:22 From Colin Leuthold to Everyone: Reacted to “KFC Kalman Filter[ed] Ch…” with 😂
10:57:04 From Terry James to Everyone: Search on KFC “The Chizza”
11:04:57 From Nathan Kohagen to Everyone:
https://en.wikipedia.org/wiki/DO-178B
https://en.wikipedia.org/wiki/DO-178C
11:13:05 From Bob to Everyone:
James, I found my neural network code. It is in C++ under the Visual Studio environment, and uses a couple of external packages: Intel Math Kernel Library (MKL), Intel Performance Primitives (IPP), and RapidJSON. It isn’t well-commented code either. If you still want it, I can put it on Dropbox.
11:14:23 From James Newton to Everyone:
Replying to “James, I found my…”
Yeah, I’d love it! I’d rather see it on GitHub, but if you are only willing to share it privately, please include an email or something so I can ask permission to share it under some conditions, in case I find a way to use it in a class.
11:15:39 From James Newton to Everyone: Is this the right slide?
11:16:00 From Lloyd Moore to Everyone: Looks okay from my side.
11:16:18 From Bob to Everyone:
If you are really looking for a method of doing polynomial curve fitting, the algebraic method is actually faster and more accurate. Neural networks just approximate the algebraic approach.
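Bob’s point can be shown directly: fitting a degree-d polynomial by least squares is just solving the normal equations (Vᵀ V)c = Vᵀ y, which is exact and fast compared with training a network. A plain-Python sketch with Gaussian elimination (the sample data is made up; no libraries assumed):

```python
# Algebraic polynomial least-squares fit: build the normal equations
# from the moment sums, then solve them with Gaussian elimination.
def polyfit(xs, ys, d):
    n = d + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    rhs = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    # back-substitution
    coef = [0.0] * n
    for i in range(n - 1, -1, -1):
        coef[i] = (rhs[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef  # coef[i] multiplies x**i

# recover y = 1 + 2x + 3x^2 from exact samples
print(polyfit([0, 1, 2, 3], [1, 6, 17, 34], 2))  # coefficients close to [1, 2, 3]
```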
11:16:19 From Lloyd Moore to Everyone: Right now he is at slide 13 of 75 – are yours updating?
11:16:41 From James Newton to Everyone: I’m still seeing the Vine robots overview, and Charlie is talking about very different things.
11:17:09 From Lloyd Moore to Everyone: Try disconnecting and reconnecting – sounds like your screen share may have frozen….
11:17:32 From Lloyd Moore to Everyone: He’s now at slide 14 if that helps.
11:17:34 From Terry James to Everyone: Slide just changed for me.
11:18:02 From James Newton to Everyone:
Ok, that’s better.
11:18:08 From Lloyd Moore to Everyone: Reacted to “Ok, that’s better.” with 👍
11:18:36 From Thurman Gillespy to Everyone: Dune Sandworm!
11:24:42 From Nathan Kohagen to Everyone: Reacted to “Dune Sandworm!” with 👍
11:56:44 From Bob to Everyone:
Here is the DropBox link for the neural network code. My email is robertphiggins@comcast.net. I suspect you will need some explanation to get it operating and how it works.
https://www.dropbox.com/scl/fo/wblgjpviwoe5185lrkub6/h?dl=0&rlkey=k7epg821s4dmzfiid6f80096t
12:11:49 From Scott to Everyone: Did I hear you say “$8” for this motor? The motors at https://www.pololu.com/search/compare/60 look similar, but are $20.
12:12:18 From Nathan Kohagen to Everyone: Thank you Charlie!
12:13:45 From Terry James to Everyone: How slow do you want the robot to move?
12:14:33 From Terry James to Everyone: The air bladder robot should be able to move a lot faster
12:15:41 From Terry James to Everyone: Like exploring rubble of a collapsed building
12:15:58 From Terry James to Everyone: Search and Rescue robot
12:27:05 From Charlie Xiao to Everyone: Scott, that looks correct
12:27:27 From Charlie Xiao to Everyone: You can also find similar but cheaper motors on Amazon.
12:27:39 From Terry James to Everyone:
Thank you again.