Microsoft Home Server review

Here is a question for you:  when was the last time you backed up your home’s digital files? Maybe never? Bad answer.

Microsoft has been working on a solution, and it entered its final production push this past week. The product is called Windows Home Server, and it is a stripped-down version of Windows Server 2003, which normally costs a thousand bucks or so. For the time being, you can download a time-limited version (it will work until December) freely from this link. You do need to sign up and answer a few questions to join the Connect service, which also hosts other pre-release software from Microsoft.

You need to install the software on a new machine: it will wipe your disk clean and boot up automatically with the Home Server running. The software is designed to run “headless” which means that you don’t need to attach a monitor or a keyboard, once you get beyond certain basics that I will talk about in a moment. It will install the operating system, split your hard disk into two partitions (one for system files, one for data), and set up a bunch of shared drives for pictures, videos, files, and so forth. Think of this as layering a simple set of controls on top of the standard Windows server platform.

To access these shares, you will need to run another piece of software called Home Server Connector Software from each computer to set up the network connection. There are basically two different levels of access – “remote control” for the administrators that gives them access to the server control console, and ordinary file and printer shares for everyone else.

I tried it out on my home office network to mixed results. I liked a few things:

First, getting to the reason for this column, it is very easy to back up your PCs with this product, provided you have a big enough disk on the server's PC. You can choose what you want to back up, and it automagically does it in the middle of the night, when traffic is lightest (and presumably the PC to be backed up is still powered on). You can set up a different schedule if you are pickier.

Second, Home Server can also automatically synchronize its shared folders with ones on your local PC – a neat trick, something you might consider for, say, sharing your pictures or videos across the network, and something that has been standard in the Windows server line for some time.

Finally, you can control the server from outside your home, if it can figure out how to open up your home gateway's ports, which it does via UPnP. Sadly, my 2Wire DSL gateway doesn't support this (it doesn't support a lot of other things either, but that discussion will have to wait for another day). It would be nice if there were an alternative to UPnP, but there isn't.

Here are some things that I didn’t like about the software.

First, you initially need a complex password to set the darn thing up, meaning something with at least seven characters, upper and lower case, and numbers too. That seems a bit onerous for the average home network. This can be loosened up once you get the first user going.
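If you are curious, the rule boils down to a simple check. Here is a little Python sketch of my reading of it (an illustration only, not Microsoft's actual code):

```python
def meets_complexity(password):
    """Approximation of the Home Server setup rule: at least seven
    characters, with upper case, lower case, and at least one digit."""
    return (len(password) >= 7
            and any(c.isupper() for c in password)
            and any(c.islower() for c in password)
            and any(c.isdigit() for c in password))
```

So something like "Passw0rd" gets through, while "password" or anything shorter than seven characters does not.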

Second, when the install was done, it didn't recognize the Intel network adapter in a fairly recent Dell. Once I installed the right driver, I was good to go. Third, despite its headless installation, you will still need to be sitting in front of the server to set up a shared printer. Finally, the only supported clients are Windows XP with Service Pack 2 and Vista – if you have anything older on your home network, and chances are good you do, then don't even bother with the product.

Is this a good deal? It is hard to tell until Microsoft sets pricing. There is still talk that it will be available both as a bundled piece of hardware from the usual suspects and as a software download, but we’ll see.

If it does come as low-cost software and you have an older PC and can upgrade the storage, it might be worth it. But if you have older Windows and Macs, then no: you are better off buying either a Mac mini or a network-attached storage box and saving yourself the trouble.

MediaGate MG-350HD: An inexpensive networked video server

The growing antagonism between Google/YouTube and the creators who "involuntarily supply" their video content has shown that the PC is becoming the place to go to watch videos. So wouldn't it be nice if you could stash all of your huge video and music files someplace other than your own computer's hard drive? And if such a place could be easily connected to your living room TV and stereo system, so you could watch videos and listen to music without having to integrate a PC into your living room stack of gear? And wouldn't it be nice if you could use a wireless connection to move these files from your PCs too, since you can't or won't wire your living room with Ethernet?

These aren't empty questions; they are the idea behind the $275 MediaGate MG-350HD. It is the size of a hardback book, with lots of cables and connectors to hook up to your TV and hifi. It sorta works.
http://www.Mediagateusa.com

The box has your choice of component, composite, S-Video or DVI video connectors, and coax, optical or twin RCA audio connectors. Among that selection should be some combination that can hook it up to what you have in your living room. Unlike a Media Center PC, it is quiet and doesn't generate much heat.

You can connect it to your PC via a regular USB connector, or use either the wired Ethernet or wireless networking ports. It doesn't come with a hard disk – you'll need to buy an older-model 3.5 inch IDE drive. (It would have been nicer if they had included a SATA interface, especially since those drives are pretty cheap now.) After removing four cover screws, you can quickly connect the IDE drive inside the box, close it back up, power up and format the drive. Instructions on how to do this, written in badly translated English, cover various versions of Windows.

The good news is that the box has just enough intelligence to handle all sorts of video files that I stored on it. I asked my 20-something stepson to give me a sample of video downloads to try out. One came with German subtitles, one was a version of Babel without any subtitles (which is tough because a lot of the dialog isn't in English), and one came more or less like the theatrical version. None of these files would play on an ordinary Windows PC without installing additional audio or video codecs, such as DivX. They all ran as-is just fine on the MediaGate.

The bad news is that the wireless and networking support will take some effort to get working. To use the MediaGate as a network storage device, you need to install a special driver on your Windows PC. It was easier to plug in the USB cable and move the files over to its hard drive, which somewhat defeats the idea behind a network storage box. I have WEP configured on my home network, and I couldn’t get the appropriate key to work with the MediaGate, despite its supposed support for this encryption level.

The unit comes with a small remote control that is used mainly for setup tasks and for scrolling through the various files to play them. And scroll you will – the interface is similar to Windows Media Center, showing folders and file names on screen in fonts so large that only a few listings fit per screen. If you have hundreds of files, it will take some effort to find them. One cool feature is that you can store video and audio files on ordinary USB key drives, plug them into the unit, and play them.

Both audio and video quality seemed acceptable. You have your choice of 4:3 or 16:9 aspect ratios for the video. Overall, the device does a decent job. If you aren't a fan of Windows Media Center, this might be a good alternative. Apple's iTV is comparably priced once you factor in that it includes the hard drive but not the cables. But iTV doesn't do 4:3, and you need to use iTunes to manage how content gets moved over to the box.

New Year’s Resolutions

Happy New Year everyone, and I hope your holidays were relaxing, fun, or at least a break from your working world. I am not a big fan of making resolutions for the new year, but one that I made last fall bears repeating: I hope that in the coming year you won't lose any data on the computers that you care about.

I thought I would take some time and describe my own process here at Strom World HQ, in the hopes that this will encourage you to do something similar. You'll see it isn't a simple process, and it will take some time to figure out your own strategy. Anyone who claims that making backups is a one-step process isn't worth listening to.

One up-front caveat: I use a Macintosh as my main business computer, so those of you who use Windows will have to find something similar.

The key to my data backup strategy is to understand how I use my data and to identify the weak points: which kinds of catastrophes could happen, and which particular data would go missing as a result.

I had a disk go south on my Mac last fall, and this prompted me to develop my current system. And years ago a nearby office had a fire that didn’t do any damage to my office, other than the firemen breaking down my office door to see that nothing was burning. At the time, I dutifully did backups — on tapes — and had them lying right next to my PC. So I learned the importance of having offsite backups.

The first law of backups according to Strom:

[Backup law #1]: Make the routines simple and not time-consuming, otherwise you won’t do them.

My first line of defense is having two hard disks in my Mac. They are independent disks – meaning that I don’t RAID them or do anything more complex than have them operate side-by-side. I use a piece of software called SuperDuper that costs less than $30.

In the time it took me to write this column, it made a complete copy of the 300,000 or so files on my main Mac hard drive over to the second hard drive. And it also makes the second hard drive bootable, so if something really goes wrong on the boot drive, I can swap them and be up and running in minutes. I have tested this too, which brings us to Strom's second law:

[Backup law #2]: Make sure to do a few dry test runs, just so you know what to do in case of emergency.

There are numerous stories of people doing backups for years, only to find out that there is nothing on the tapes or disks or whatever media they eventually try to use to restore their data. In the high drama when something goes wrong with your machine, you want to have a clear plan of attack to restore your data. I also check the second hard disk from time to time to make sure that the newest files have been copied over. Doesn’t hurt to check!

If you run with a laptop or if you are tight for space and can’t install twin drives, you can make use of one of the many external hard drives and use SuperDuper to make copies that way – although it will take about twice as long.

I do the SuperDuper backup maybe twice a week, or more if I am doing a lot of writing. That seems to be working well. It is a really nice piece of software. Those of you who run Windows might want to post your own recommendations on my blog at strominator.com.

But the SuperDuper backups don’t cover the office fire situation. This brings up the next law:

[Backup law #3]: Make sure you have your data stored somewhere offsite.

For this situation I burn CDs and DVDs, and take them offsite. It doesn’t really matter where, just as long as it isn’t near your computer. A year’s worth of my data fits comfortably on a single CD, and these CDs go in a secure place that isn’t in my office. A bank safe deposit box is a good alternative. You just have to remember to bring the new CDs over to it periodically.

How often I burn and what I burn depends on the situation. I try to do it at least once a month. A key part of this strategy is identifying all of your application data and keeping it in one overall directory to make these backups easier. Some applications, particularly Microsoft Office and Outlook, make this more difficult: they squirrel away their data files in some obscure directory, or worse yet, keep configuration information in their program files directory. And the information stored in your browser (cookies, passwords, and the like) is also hard to duplicate with a file-based archive.

At the end of the year I burn a DVD with my entire data archive – ten years or so worth of documents. It took me some time to collect all of this information, and I don't want to lose it. This brings up my final edict:

[Backup law #4]: For information that changes very often, save it somewhere online.

It doesn't really matter where and how, just as long as it is off your desktop and easily accessible. There are a number of online storage sites, and they all pretty much do the same thing, using a Web browser or WebDAV connection to transfer your files.
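If you are wondering what a WebDAV transfer amounts to, it is little more than an HTTP PUT of the file to the server. Here is a rough Python sketch – the server URL is made up, and a real service will also want authentication headers:

```python
import urllib.request

def webdav_upload_request(base_url, filename, data):
    """Build an HTTP PUT request that copies one file up to a
    WebDAV server. The URL is hypothetical; pass the result to
    urllib.request.urlopen() to actually send it."""
    url = base_url.rstrip("/") + "/" + filename
    req = urllib.request.Request(url, data=data, method="PUT")
    req.add_header("Content-Type", "application/octet-stream")
    return req
```

That simplicity is why so many desktop tools can mount these services like an ordinary drive.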

Part of the off-site storage that I use is keeping my main email and contacts stored with Google's Gmail. This has been working well for me over the past year: I love Gmail's tagging system, the fact that it never deletes anything, and that it makes it relatively easy to find a message. Of course, when I heard about how Google lost a few dozen people's email, it sent me into a panic.

So if you do use Gmail, you can at least export all of your contacts to a CSV file that you can store on your desktop, in case they lose your data. As for your email archives, you are out of luck.

Some writers that I know take things a step further and archive their online stories to PDFs. This is helpful, particularly in cases where Web sites go out of business, or suffer link rot, or some other problem. I haven’t gotten this far but could see myself doing this one day. But at least I have my original manuscripts covered.

As you can see, making backups isn’t simple. Take some time to develop the system that will work for you, and then don’t get lazy or lax. When something goes wrong, you’ll thank me for starting off your new year on the right foot.

Caught between computers

There must be something wrong with me this week. For someone who has spent the better part of his career dealing with networked systems, I seem to be caught in between computer networks more often than most people. Or maybe it is just because I am more sensitive to the issues involved? It’s downright spooky.

First there was my Bank of America online account. BofA bought credit card issuer MBNA a while back, and on Monday they finally brought the two systems together, so I can view my card transactions from the same system that has my banking details. I was counting the days, let me tell you. Things don't get much more exciting around here than the chance to see two systems brought together to make my life easier.

Well, so much for anticipation. When I went to pay my bills, I got dumped into a screen telling me how wonderful BofA was going to make my life if I wanted to sign up for their electronic bill presentment system. Trouble is, I already had done a lot of work specifying my payees under the old system, the same payees that were MIA from the screen I was looking at. Harumph.

I fired off an email to BofA support (well, a pseudo-email, because you can’t really communicate with their support over ordinary email, thanks spammers) and got a non-reply reply telling me that I basically was an idiot and asking me to send them tons of useless documentation. So I called them, and after spending 45 minutes on their line waiting and talking to someone that didn’t know anything, I finally got a representative that fessed up that yes, it was them and not me, and yes, the unification of their back-end systems wasn’t going well and it would be a few more days before they fixed things. Just so my time on hold wasn’t a complete waste, I asked that this kindly person communicate to their support department that people like me aren’t crazy and deserve a bit more respect when they debug the bank’s systems for them.

The funny thing is that BofA has me listed in their system as being a customer since the 1980s, when I must have opened an account with some subsidiary that they have since bought and I have since forgotten about. How about that? So is this any way to treat such a long-term customer, I ask you?

Next it was on to Macy's, which has been busy unifying things on the department store scene. My wife recently bought some furniture and was motivated to open a charge card to get a nice discount. She couldn't get a new card, because Macy's claimed she already had one with one of the department store chains they have since bought. When she tried to open one in my name, she hit a snag with one computer not liking what was being input. Eventually, we sorted it all out, but not before my wife had spent several hours at the store. This week I finally got my card, but now we have to chase down the discount. Doubtful, I say.

To top things off, I had to ship something out today via FedEx, and I went to their Web site to try to find one of their nearby storefronts. Well, since FedEx bought Kinko's, you can't easily tell what is a shipping storefront and what is a copy shop. And polluting the listings are the many places that are basically nothing more than a mailbox on a street corner. If my package had been small enough to fit in one of their drop boxes, I would have been fine. But it wasn't, and the unified Web site is a real mess to navigate to find the right place.

How hard can it be for FedEx to improve their store listings? People come to their Web site to do two or three simple things. Ironically, FedEx was an early adopter of Web technologies and had a very useable site for far longer than its competitors. Not now, though.

I may start using UPS; they have two locations within a few blocks. And while I would love to switch from BofA, it's too much trouble, and anyway they got my problem fixed this morning.

I know it is nice that all these companies are expanding, buying out their competitors and making tons of money. But guys, let’s get the basic business integration issues down sooner rather than later. Customers shouldn’t be your beta testers.

Okay, thanks for listening to me vent. You can return to your regularly scheduled programming now.

Top ten ways to secure your SOHO network

Maintaining a more secure small business or home network isn't an easy task; even for an experienced IT hand, it takes time and energy to keep things locked down. Computerworld asked me for my top ten most critical steps to keep your data from ending up elsewhere. None of them takes much time or effort to accomplish.

You can read the entire article here.

Get Me Graphics for Vista

The latest news about de-planetizing Pluto has got me bummed. In my misspent youth, the story about how Clyde Tombaugh discovered a planet was one of those moments that steered me towards science and technology, along with watching Mr. Spock fight Tribbles, decoding Clarke’s 2001, and trying out the experiments from Mr. Wizard. While I can understand the decision, it is a lot like telling Columbus that he landed on some Caribbean island instead of the U.S. of A.

Well, let’s not dwell on Pluto but move on to something else to get really depressed about. If you are considering getting more experience with the latest beta of Windows Vista, you will find that your graphics horsepower is woefully inadequate for running this operating system.

I have found from my tests that you will need a discrete graphics processor if you are going to have any kind of productivity with Vista at all. This is probably going to be most noticeable with your laptop, which traditionally has lagged behind desktops in terms of graphics firepower. Why is this important? Vista treats itself like one big video game, with pixel shaders, anti-aliasing, and the like. Everything on the screen is now considered a 3D polygon that can be manipulated by the OS.

While there are some obvious reasons for Microsoft to offer these enhancements as part of its OS, particularly for the gaming generation, there are some non-obvious ones as well. Aero — what the new Windows interface design is called — makes Vista more reliable by separating the screen drawing commands more completely from the applications' control. Many of the crashes in XP were caused by this lack of separation, with one application stepping on another's screen real estate. The testing that I have done indicates that Vista will help fix these problems. But the fix comes at a high premium.

The wisest course of action is to wait and postpone buying any new graphics card until Vista ships next year. If you can't wait, then make sure your card has at least 256 MB of on-board video memory, and see what your vendor says about supporting DirectX 10. This is what will guarantee Vista functionality. And if you are making a major PC buy, consider how you will deal with your video subsystem, and think about getting even more video RAM.

Yes, 256 MB of video RAM is going to be the starting place. That is a heck of a lot of RAM for a general business computer, and chances are most of your corporation’s PCs have far less installed.

I tested the hypothesis that an add-in graphics processor is a necessary condition for running the latest beta 2 of Windows Vista by comparing two identically configured PCs, one with a plug-in Nvidia GPU and the other using the Intel integrated graphics on the motherboard. I found that without the extra GPU, you are wasting your time and your own productivity. The experience with integrated graphics is borderline acceptable at best, and most users will quickly become frustrated by the limitations Vista imposes on graphics-poor PCs when running multiple applications. By multiple, I mean more than one: Vista runs a lot of stuff under the covers, much more than XP.

What this means is that users running the on-board Intel graphics will not get the performance and productivity gains that they would with a discrete graphics card. Intel will try to obfuscate this message in the coming months, and the major PC vendors have already begun plastering "Vista-ready" logos all over their Web sites. Ignore these messages: find out how much video RAM you can really afford, and make sure you get a plug-in card rather than anything onboard.

A new Dell that I bought about a month ago, supposedly "Vista-ready," came with a whopping 8 MB of shared video RAM. Going into the BIOS, I could see that my choices were keeping this setting or dropping the video RAM down to 1 MB. Some choice. You might have similar circumstances, if you even know how to fiddle with your BIOS or download a new one that might allow further adjustments. As a result, Vista runs slowly on this PC, and I don't see any of the 3D treats that I could have gotten had I installed a better video card.

Microsoft has this mickey-mouse assessment tool that will grade your system and tell you how it is expected to perform with Vista: don’t even bother with the download, because it is easy to game this tool and have it report just about anything.

I’ll have more to say about Vista in the coming months, but you might as well know the bad news now about the add-in graphics scene as you try to console yourselves about the whole Pluto thing.

How to set up WPA2 on your wireless network

If you are like most people, your home or small office wireless router is probably running without any encryption whatsoever, and you are a sitting duck for someone to easily view your network traffic.

Some of you have put encryption on your wireless networks but aren't using the best security methods. This means you are running your networks with inferior protocols that offer a false sense of protection, because they are very easily broken.

The best encryption method is WPA2. Support for it is slowly appearing on a number of wireless devices, and the latest incarnations of both Windows XP and Mac OS X include it too.

Read my tutorial on how to set up WPA2 in Computerworld here.

Online Storage Solutions

I am a big fan of backups ever since my office building had a fire one afternoon. An electrical short in the office directly below mine caused the fire, and I fortunately wasn’t there when the fire started – I had left the office to run an errand. But when I came back and saw the fire engines circled around my office building, my heart skipped a beat. Yes, I had done plenty of backups of my data, indeed just that morning I had made one. And my backup was sitting right next to the computer on my desk! A lot of good that was going to do me now, to be sure. Fortunately, nothing in my office was harmed, and I learned a valuable lesson.

The first rule of backups: make a habit of taking them to another location.

With the advances in broadband and better Web services technologies, you now have a lot of choices when it comes to storing your data offsite. Depending on how much money you want to spend, you can accomplish this for pennies, a few dollars, or hundreds of dollars. Let's start with the pennies first.

The simplest strategy is to burn CDs or DVDs with your data, and take them somewhere else on a regular basis. When you have accumulated a bunch of discs, though, this gets somewhat difficult to manage. One of my readers has a great take on this: "You can use my rule of thumb: Whenever you visit your mom, take a new backup and keep it at her house. If she nags you about not seeing her often enough, it means your backup is not up-to-date!"

Next up the cost curve is using one of the resellers of Amazon's S3 storage API. The two services that I tried are ElephantDrive.com and JungleDisk.com. Both use small applications that encrypt your data and then move it up into Amazon's storage repository. The downside is that Jungle doesn't currently offer synchronization with the data on your hard disk, so you have to manage the updates on your own – although they say they are working on a solution. Both are also fairly slow at sending files up to Amazon – 150 MB took 45 minutes with one service, and two hours with the other.

The two Amazon-based services are dirt cheap – you will be hard pressed to spend more than $10 a year for 2 GB of data. Amazon has published its storage API, and we can expect more players to enter this space. Jungle handles both Macs and Windows, while Elephant is Windows-only. Both are small start-ups, which may be an issue: who knows how long they will be around, and the last thing I want is to save my data someplace and then have the company disappear. On the other hand, "We have released all our source code on how we do the encryption and how we store the information on S3," says Jungle's head guy, Dave Wright. "Users can feel confident that there are other tools where they can get copies of their data off of S3." And at these prices, this can be just one of many places that holds your data.

A better solution on the synchronization front is FolderShare.com, which used to be owned by Iomega but is now part of the Redmond Borg. You install the software on two PCs – it works on both Mac and Windows – and any file saved in a directory you designate on one gets copied to the other. It is presently free, too.

One company that has been around for a while in this space is MyDocsOnline.com. They offer a confusing array of pricing plans, but their backup service gives you 5 GB for about $100/year. They are competing with Xdrive, which is now owned by AOL and will be offering 4 GB for free starting next month. Another company in the mid-price bracket is Box.net, which offers 4 GB for about $60/year. Some other solutions suggested by my readers include Datadepositbox.com, carbonite.com, and LogMeIn. These are all Windows-only solutions as far as I know at present.

I have been using MyDocs for years, and like the system. Uploads are fast, and they also support WebDAV, which makes it easier to mount the server on your desktop and save files to it. Synchronization isn’t as much of an issue if you can just delete and replace the entire data store with a clean set of files, which is what I do.

At the top end of the market are companies like eVault and Iron Mountain that offer online storage for larger enterprises. These typically start around $100 a month for 5GB of storage. Apple also has its .Mac offering too. It costs $200 a year for 4 GB of storage.

No matter which service you use, start to do something today about saving offsite copies of your data. Don’t wait for a fire or other disaster to get going on this.

WiFi Interference

While we are trying to stay cool this week, I had some time to do some wireless tests at the Strom World HQ. When I started to add up all the wireless stuff beaming radio waves around here, I was impressed that everything just sorta works, given all the crossed signals.

Let’s make a list, shall we, of the wireless stuff that is in a typical house: besides the wireless LAN, there are Bluetooth headsets, wireless mice and keyboards that may be running on infrared or some other signal, remote controls to the TV and stereo, cordless phones, microwave ovens, and let’s not forget garage door openers too. Heck, my wife just bought a fan yesterday that had a wireless remote control, which got me thinking about all this wireless stuff out there.

A nice article by Network Computing's Jameson Blandford that covers some of the basics of dealing with interference can be found here.

If you are in the market for a new cordless phone, make sure you get one that operates at the higher 5 GHz frequencies if you can — you will get a clearer signal and stay out of the WiFi band that your network operates on. That is, unless you have an 11a network.

What about the wireless LAN? That has its own idiosyncrasies. Most of you are probably not aware that the current crop of wireless LANs operates on just three non-overlapping channels out of the 11 available – channels 1, 6, and 11. It is best to do a site survey and figure out if you can move your own access point/router to some other channel that will have better reception. In my case, I had plenty of neighbors using channels 1 and 6, but only one on 11.

How do you do a site survey? If you have the Canary WiFi spotter, you can do it easily in about two minutes, because this handy device tells you immediately what channel everything is talking on. And you can move from one side of your house to the other easily and see how the signals change. Failing that, you can bring up your laptop’s wireless control panel and root around for a while seeing if you can make sense of the information it is telling you while walking around your house.
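Once the survey tells you which channels your neighbors are on, the arithmetic of picking yours is simple. Here is a small Python sketch of the logic; it counts a neighbor as interfering if it is within a couple of channels of yours, which is a rough approximation (b/g channels actually bleed even further than that):

```python
def pick_channel(neighbor_channels):
    """Given a list of the channels your neighbors' access points use,
    return whichever of the non-overlapping channels (1, 6, 11) has
    the fewest nearby networks on or near it."""
    def congestion(ch):
        # count any neighbor within 2 channels as interference,
        # since 802.11b/g channels overlap their neighbors
        return sum(1 for n in neighbor_channels if abs(n - ch) <= 2)
    return min((1, 6, 11), key=congestion)
```

In my case, with neighbors piled onto 1 and 6, this logic points straight at channel 11, which is exactly what I did.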

Once you pick a channel, you will have to dig out your wireless router's manual, figure out how to connect to its management interface, and make the change on the right page. While you are on this page, if you have a wireless router that supports more than one protocol, such as a b/g or a/b router, now is the time to turn off the ones your computers don't use. If you have those 5 GHz phones, then turn off the 11a signals and you will get better reception, provided none of your laptops are running 11a cards. Got that?

Finally, there is Bluetooth, which operates near the same frequencies as our 11b/g wireless networks in the 2.4 GHz range, along with microwave ovens. Now, most of us aren't on our Bluetooth headsets all that long anyway, but some new ones coming out can do double duty – answering calls from our desk phones and also connecting to our PC audio to listen to music or make VoIP calls. I tried out the GN Netcom GN9350 and it worked well, even on my Mac, which isn't promised in its manual. The one downside is that there are a lot more wires on my desk, and the unit doesn't do caller ID on its futuristic base station. I could roam all over the house with this unit, which is very nice, but I wonder how my wireless LAN connectivity is affected. Blandford's article talks about a 10% hit in his tests.

Have fun with your wireless networks, and enjoy your summer.

Let us now praise vacuum tube radios

Mike Pusateri's Cruftbox is one of my guilty pleasures. Mike runs an IT shop at Disney TV in beautiful downtown Burbank and has a wonderful blog where he takes apart and puts together all kinds of stuff. He is one of those people who are fascinated with the world around them, and he has a very droll sense of humor that makes reading his posts fun too.

Here is a post about an old vacuum tube radio that he found at a garage sale. For those of you too young to appreciate this, vacuum tubes were the things that did what ICs do now, only they used a heap more power and were a heckuva lot bigger. What really made this post for me wasn't just that he picked up an old radio and that it worked just fine, but that the radio included a schematic diagram of its components. They did that back in the day, you know – included the docs as part of the overall package. Something our current electronics vendors could learn from.