We're About To Top $11,000 For Mac

    $999 for AppleCare+ :eek: Never thought I'd ever see that. But it's in line with how much the Edition costs: if you can afford $10,000 to $17,000, then chances are you can afford $11,000 to $18,000.

    Photo: Apple. It's official! As we reported yesterday morning, the iMac Pro is now officially on sale in the United States, starting with the two base models (the 8-core and 10-core variations) and a price tag of $5,000 for the standard configuration. Up until now, that's the only price we knew for this behemoth of a computer, what Apple calls 'the most powerful Mac ever', but now that the iMac Pro website is official and the product is up in the store, we can find out how much a fully loaded version costs. And let's just say you should hold on to your wallet. If you go all the way up to the 2.3GHz 18-core Intel Xeon W processor with Turbo Boost up to 4.3GHz, 128GB of 2666MHz DDR4 ECC memory, a 4TB SSD, and a Radeon Pro Vega 64 graphics card with 16GB of its own HBM2 memory, your price tag goes up considerably.

    Drum roll, please: $13,200. If you order one of the more basic configurations, you'll be able to get yours in the next 1-2 weeks. However, if you plan to shell out the aforementioned $13,200 for the fully loaded 18-core beast, don't expect to get the computer in 2017: according to the Apple store website, a fully loaded variation will ship in 6-8 weeks. To learn more, or to configure your own (if you're lucky enough to have pockets this deep), head over to the Apple online store.

    It seems somebody's simplistic reasoning is that a 2017 iMac Pro will use the same CPU chips as a 2013 Mac Pro. Intel keeps improving their processor architectures, and current chips are therefore likely to be higher-performing than those of a 2013-generation machine.

    The 18-core iMac Pro likely uses the new Xeon W-2195: 'In essence, these are Xeon versions of the current Skylake-X (Core i9) processors with all the pro features enabled, such as the extended memory support, vPro, Intel's AMT, and the standard enterprise Reliability, Availability and Serviceability (RAS) features.' Thus, the iMac Pro will use 'enterprise' CPUs with performance similar to the Core i9, the major difference being that the RAM contents will be protected with ECC, which is a feature not available on desktop CPUs like the Core i9.

    'It seems somebody's simplistic reasoning is that a 2017 iMac Pro will use the same CPU chips as a 2013 Mac Pro.' Are you sure? Do not count yourself as everybody. Nobody said it is the same chip. The comparison back then was of i7s with the Xeons of that time.

    Nobody anywhere assumed that the i7 and the Xeon are the same. However, you are totally wrong that it is an i9. Most critically, the Xeon has much lower clock speeds and is not open to overclocking either.

    That also explains why the i9 has a higher TDP than the Xeon, apart from other differences. Nobody apart from you makes that assumption.

    Nevertheless, that hardly changes the fact that the choice of hardware in the iMac Pro is just wrong for the target audience of this site. Photographers are better off with fewer cores that max out clock speed. Videographers are better off with an i9, which gives you the same core counts as a Xeon but at higher clock speeds.

    Let me remind you that you are the person who made this amusing assertion: 'the gap has only widened since with newer i7 and i9 CPUs compared to Xeons', showing that you didn't even know that the new Xeons have basically the same architecture as the i9. For chips with the same number of cores (e.g. 10-core, which may be the sweet spot in price/performance ratio for the iMac Pro), the Xeon and the i9 have quite similar clock speeds. As for 'overclocking', you don't seem to be aware that this kind of computer tinkering is frowned upon in professional environments, and is also likely to affect data integrity and system reliability.

    Concerning reliability, I also note with amusement your assertion that 'You are better off with NVMe RAID 0'. You are apparently not aware that with RAID 0, a failure of even one device in the array will lead to data loss. Assume a RAID 0 system is striped across N devices: the chance that at least one of them fails, and takes the whole array with it, grows with N.

    Compare this to a lottery. Are you more likely to score a win (a system error) if you have N lottery tickets than only one? I guess your, well, idiosyncratic approach to system reliability also meshes quite well with your stated attitude that 'I haven't seen problems with non-ECC RAM in my photo and video processing, so I can't conceive that there could be environments where memory errors could be an issue'.
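    A quick sketch of that lottery argument in code, assuming independent device failures and an illustrative per-device failure probability p (not a vendor spec):

        # Probability that a RAID 0 array of n devices loses data, assuming
        # each device fails independently with probability p over some period.
        def raid0_failure_probability(p: float, n: int) -> float:
            # RAID 0 has no redundancy: one failed device kills the array,
            # so the array survives only if every device survives.
            return 1 - (1 - p) ** n

        for n in (1, 2, 4, 8):
            print(n, round(raid0_failure_probability(0.02, n), 4))
        # 1 0.02   2 0.0396   4 0.0776   8 0.1492

    More 'tickets', more chances to 'win'.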

    Anyway, thank you for your 'competent' (as expected) input.

    @Antisthenes I was 100% right. The gap has widened.

    'the sweet spot in price/performance ratio for the iMac Pro. Xeon and i9 have quite similar clock speeds' 1) The sweet spot is higher, around 14 cores, as long as you forget about the iMac Pro, since the i9 is not available in it. 2) You are wrong. The clock speeds are higher on the i9 than on the Xeons, both base clock and multi-core turbo (which is different from Intel's advertised Turbo figure, which is stated for a single core).

    'As for overclocking, you don't seem to be aware that this kind of computer tinkering is frowned upon in professional environments.' Wrong. You can find specialized shops that do such builds precisely for professionals and make sure they are stable.

    'apparently not aware that with RAID 0, a failure of even one device in the array will lead to data loss' What made you think I am not? Such a RAID 0 is not for backups or data storage where you need redundancy. It is purely for speed: fast intermediate raw processing, like a cache.

    It is no bother if that RAID fails. You do not lose anything important.

    I have always run it this way (even in the HDD era). And for backups I have RAID 6 NASes (which are paired and synced, and kept in different locations). Different applications warrant different types of RAID (see the sketch below); judging by your comments, you really seem to have no clue about what makes sense where.

    'I can't conceive that there could be environments where memory errors could be an issue' What total rubbish. Perhaps you can't, but I even gave you an example where it makes sense: nuclear power stations.
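    Back to the RAID point: to put numbers on 'different applications warrant different types of RAID', here is the RAID 6 counterpart to the earlier RAID 0 sketch. RAID 6 tolerates any two simultaneous device failures (again assuming independent failures, an illustrative p, and ignoring rebuild windows):

        # A RAID 6 array of n devices fails only when 3 or more devices fail.
        from math import comb

        def raid6_failure_probability(p: float, n: int) -> float:
            survive = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3))
            return 1 - survive

        print(round(raid6_failure_probability(0.02, 8), 6))
        # ~0.000415, versus ~0.1492 for RAID 0 with the same eight devices

    Which is exactly why the redundancy goes on the backup boxes and the striping goes on the scratch disk.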


    Not in video and photo processing. I am sorry that you can't read straight and think straight.

    @kaxi85 It does not have to be the same spec. The iMac Pro has a really poor choice of technology.

    You can get better performance in Adobe Premiere Pro from an i7-8700K than from these Xeons, and the i7-8700K is way cheaper. Or you can go for an i9 if you need ultimate performance; it will be way better than the Xeons and better than the i7-8700K.

    You do not need ECC memory for either photography or videography, especially if it comes at the cost of poor CPU performance, as it does with Xeons. You are better off with NVMe RAID 0 and with a better GPU. So you can build a way cheaper and faster PC than this iMac Pro. And that is a dated reference (the gap has only widened since, with newer i7 and i9 CPUs compared to Xeons).

    All the talk about not being able to add extra video cards to this machine ignores advances that have been made both in hardware and software. This machine has four Thunderbolt 3 ports.

    Add whatever card(s) you want as eGPU(s). It works seamlessly and is recognised by the OS. I have an RX480 connected to my Mac Mini, all my output goes through it, and I can upgrade it to whatever I want in the future. I believe Apple will release their own eGPU products in the next year or two, but they'll be priced at a premium. You can do it now quite easily and affordably.

    Apple iMacs have never been gaming computers. You can run some games, even top titles at release time.

    But they have never been designed for kids, teens and gamers! The GPU in these computers is meant for consumption and production of multimedia other than the latest games. And you raise an important point: external GPUs are now the answer when you are doing demanding work like CAD or 3D modeling, and at the level where extreme processing is required, you really offload the job to a server somewhere else, like Amazon HPC (High Performance Computing). The GPU in this thing is more than enough to run whatever you really need. Now look at the Microsoft Surface Studio.

    That is a joke! Beautiful design, looks great, nice CPU, and then they threw in an Intel GPU that can barely run the display itself!

    Apple has no such problem in this case! Have you really tried using an external GPU? Apple has deserted the Mac mini as well (compared to a bunch of powerful SFF systems on the Windows/Linux side), and it still has an outdated TB2 port with pitiful bandwidth. TB3 has somewhat better bandwidth, but it's still only PCIe x4, way lower than the full PCIe x16 bandwidth of 'normal' PCIe slots. Do you know how much TB3 eGPU cases cost? I had one, and it cost me $300 for the case alone.
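    The rough lane math behind that bandwidth complaint, as a sketch (illustrative peak figures; real-world throughput is lower after protocol and controller overhead):

        # Approximate PCIe 3.0 bandwidth per lane count. TB3 tunnels at most
        # a x4 link to an eGPU; a desktop slot gives the GPU a full x16 link.
        PCIE3_GBIT_PER_LANE = 8 * 128 / 130  # 8 GT/s with 128b/130b encoding

        def pcie3_gbytes_per_sec(lanes: int) -> float:
            return PCIE3_GBIT_PER_LANE * lanes / 8  # bits -> bytes

        print(round(pcie3_gbytes_per_sec(4), 1))   # ~3.9 GB/s over TB3 at best
        print(round(pcie3_gbytes_per_sec(16), 1))  # ~15.8 GB/s in a tower slot

    And that x4 ceiling is only part of the hassle.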

    You also need to plug an additional power cable into the enclosure, making the whole combo a tangled mess. Why do you need to sacrifice bandwidth, spend extra money and add a bunch of cables for a 'desktop' PC? Isn't it infinitely better to just put the card in the main box and forget about it?

    I didn't need it for years, then I did. I didn't have enough for a whole new system, so I found solutions that fit the budget and my needs (I bought a new camera and lens rather than a new computer). A Hackintosh would be another way to go, but I don't like the idea of being stuck with an old OS, or of Apple deciding to shut the door on Hackintoshes at some point, with that being out of my control. Works for me.

    'Isn't the whole point of an all-in-one system tidiness?' I don't have an all-in-one system. I have a Mac Mini & an Eizo monitor.

    Historically that is not true. Without adapters many old lenses are orphaned (e.g. Canon FD), and lenses are only one issue.

    You didn't address the performance issues: you can't upgrade, e.g., a Sony A7 with the A7R II processor, you have to buy a new camera, and that is typical and almost universal. Also, eGPUs attached via Thunderbolt do not suffer any significant degradation. Finally, you can upgrade some components; it just won't be doable by someone in their basement, it will only be done by certified professionals.

    As anyone with an iMac knows, it's not silent when it's running all-out. Fans do kick in.

    It’s not too noisy either, though. Apple takes a mature approach to cooling by not cooling parts too much until it has to. You don’t need to cool most GPUs until their temperature reaches 85C. Same with processors, except there the temperature is even higher.

    Those are the operating conditions in which these parts were designed to operate indefinitely. The benefit of cooling things adaptively, only when they are hot, is that the thermal gradient is high, so you need much less airflow (and therefore noise) to reduce the temperature by a degree (rough numbers below). As long as your cooling system is capable of keeping the components within the temperature ranges outlined in their specs, everything will be fine. It doesn't shorten their lives either: that's how the specs for semiconductors work.
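    The physics behind that claim, sketched with round numbers (sea-level air, steady state; purely illustrative):

        # Airflow needed to carry P watts away at a temperature rise dT:
        # P = flow * density * heat_capacity * dT, solved for flow.
        AIR_DENSITY = 1.2         # kg/m^3 at sea level
        AIR_HEAT_CAPACITY = 1006  # J/(kg*K)

        def airflow_m3_per_s(power_w: float, delta_t: float) -> float:
            return power_w / (AIR_DENSITY * AIR_HEAT_CAPACITY * delta_t)

        # Removing 300 W with cool parts (dT = 20 C) vs hot parts (dT = 60 C):
        print(round(airflow_m3_per_s(300, 20), 4))  # ~0.0124 m^3/s
        print(round(airflow_m3_per_s(300, 60), 4))  # ~0.0041 m^3/s, a third of the air

    Letting the parts run hot triples the heat carried per unit of air, which is exactly why the fans can stay slow and quiet, as long as everything stays within spec.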

    If it says the chip can run at a given temperature, you can design to run it at or below that temperature, no ifs or buts. And that's nowhere near the limit: the limit for the CPU internal core temperature is 100C.

    This is the only relevant temperature for cooling: the one that the CPU measures and reports itself. This is not where your CPU will die or shut down or anything; this is where it begins to throttle. The limit (for Intel) for the case temperature is 72C.

    iMacs do work well under heavy load and may do some throttling, but not nearly as badly as Microsoft's top Surface Book 2/Studio and so on. Those are designed to look pretty and handle normal use, but even running Adobe products on them for ordinary 20Mpix files can make their fans kick in fully, and then you get serious CPU throttling. The Book 2 is even so bad that the power supply that comes with it can't keep up with the battery drain when you are gaming!

    They designed a beautiful, great laptop for gaming, and then you can't game with it, as your battery runs out after an hour! And even if you play for 15 minutes, your CPU and GPU get so hot that you get maybe 50-60% of the performance out of it as it throttles down! Surface products ARE meant for power usage!

    That is the whole PURPOSE of those! Microsoft throws in the best possible CPU and GPU (like an Nvidia 1060) they can fit into the smallest possible package, and then targets it at professional, heavy-task use! They are NOT for secretaries or average Joes browsing the web and Facebook.

    The problem with all these things is that they are designed to be small and neat. That is why workstation computers ARE WORKSTATION computers: they have towers with lots of space for cooling, large components and so on. You can throw a huge amount of processing capability into a tiny laptop like the Surface Book 2, but you have a TINY case, TINY space and all the problems the laws of thermodynamics dictate. You just can't get around them. But when the typical use is such that you rarely need full performance, it isn't a problem for most! Gaming is one of the most demanding tasks you can run, as you are maxing out CPU, RAM, GPU and VRAM all at once. The Microsoft Surface is a hybrid.

    Not a tablet, not a laptop. That is the problem with the Microsoft Surface: you can't use it from the lap as comfortably as a laptop, and you can't use it in hand or on the lap as comfortably as a tablet. It is a compromise with flaws in every situation except writing at a table, and even then it has flaws: it requires a larger footprint on the table than a laptop does, and it is more difficult to get the angle you want because there is no hinge, just a kickstand. It is a nice, light and small computer to carry with you like a magazine.


    It is very nice to write with its ultra-thin keyboard cover at a table (compared to ANY laptop keyboard), and you have backlit keys and a nice, well-working touchpad (though on the first three generations it was a little too small). This comment is, as it happens, written on a Microsoft Surface Pro, and I use it for four things really.

    1) Editing images, as it comes with a good pen. 2) Web browsing. 3) Watching videos. 4) Writing documents at a table.

    They use Samsung NAND flash chips, but not their SSDs.

    All SSD components are soldered to the board in any current Apple product. And that’s completely beside the point: those components do not heat up much on their own, and they aren’t in the path of hot air coming off of other components.

    As a vertically integrated manufacturer, Apple does care about the quality of its products. If something fails, you just bring it to the Apple store and make it their problem. Unlike most other companies, they don't get to point fingers.

    The worst part of the iMac has always been, and obviously continues to be, the glaring screen, hard if not impossible to calibrate and not showing 100% of the Adobe RGB color gamut. This alone makes it a 'no-go' for serious photographers, unless they are additionally ready to spend another couple of thousand for a decent EIZO screen.

    It is such a pity that Apple more and more turns its back on the creative community (where it came from). It would be so easy for Apple to cooperate with Eizo and produce a real photographers' iMac. I miss Steve.

    I know that film is analog and probably 'irrelevant' to digital camera users, but for photographers who care about reproducing the colors that could be recorded on color film, DCI-P3 isn't a bad choice at all:

    'DCI-P3. This is a newer standard for color response with digital cinema projection, and was designed to closely match the full gamut of color film.' Pointer's gamut is an extensive sample of the colors of reflective surfaces that exist in the real world. The TFTcentral gamut charts show that AdobeRGB has better coverage of the saturated greens, cyans and blues in Pointer's gamut than DCI-P3; the same charts show that DCI-P3 has better coverage than AdobeRGB of saturated yellows, oranges, reds and purples. In fact, it's likely that DCI-P3 actually has better coverage, in percentage terms, of Pointer's gamut of real-world colors than AdobeRGB. Some people, who have only a fragmentary understanding of color management, mistakenly think that DCI-P3 devices cannot properly display a picture file that has been encoded in another color space, like AdobeRGB. This is, of course, incorrect, as any AdobeRGB color that is also included in the DCI-P3 gamut will be correctly displayed on a DCI-P3 screen with an operating system that implements proper color management (e.g. Mac OS X) and a color-management-aware application, e.g.

    Photoshop, or the Safari, Firefox and Google Chrome web browsers, etc. Conversely, of course, any color in the DCI-P3 gamut that is also included in the AdobeRGB gamut, and happens to be encoded in a DCI-P3 picture file, can be correctly displayed on an AdobeRGB display connected to a properly color-managed computing environment.

    ProPhoto, aka ROMM RGB, is all that you need: forget Adobe RGB and DCI-P3, if you are serious of course. You will NOT really see the difference, and compared to sRGB it is almost always a waste of time for normal work, as most devices out there are sRGB and it already matches your vision very well. Even most people with calibrated displays are not working in color-calibrated rooms where lights, walls and everything are painted and adjusted to middle gray, so that the only color you see is the photo/video on screen (no wallpapers on displays, all GUI elements middle gray, etc.). And then every viewer after that has whatever random setup. The reason to use ROMM RGB is that when you are adjusting colors heavily (like past 1-2 stops) you have far more computational headroom to avoid posterization and keep smooth tones and gradations.
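    For what that color-managed round trip looks like in practice, here is a minimal sketch using Pillow's ImageCms bindings. The file and profile names are hypothetical placeholders; point them at an actual image and the ICC profiles installed on your system:

        # Convert an AdobeRGB-encoded image into Display P3 for display.
        # 'AdobeRGB1998.icc' and 'DisplayP3.icc' are hypothetical paths.
        from PIL import Image, ImageCms

        im = Image.open('photo_adobergb.jpg')
        converted = ImageCms.profileToProfile(
            im,
            'AdobeRGB1998.icc',   # source profile the file was encoded in
            'DisplayP3.icc',      # destination profile of the display
            renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
            outputMode='RGB',
        )
        converted.save('photo_p3.jpg')
        # Colors inside both gamuts come through unchanged in appearance;
        # out-of-gamut colors are remapped according to the rendering intent.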

    And in a few years, with the machine still working just fine, Apple will arbitrarily say 'you can't upgrade the OS on this machine', as they have with my 64-bit, 12/24-core, 3GHz, 64GB, 2TB/2TB 2009 Mac Pro. So that's about 8 years before new or upgraded application software starts to become incompatible with your $13,200 machine, due to using features in the OS that are denied to you by Apple. If you spread that $13,200 over 8 years, you're paying $1,650 per year, or $137.50 a month. Some might find those numbers interesting.

    'you can't upgrade the OS on this machine.' I'm typing this on a 2009 iMac, running the current version of macOS.

    I've upgraded its RAM and replaced the optical drive with a fast SSD. Newer models may not be as easy to upgrade, but it's still possible to do so. And a Mac still offers enormous advantages over any Windows PC, including far better security, stability and resale value. Unless you're a hardcore gamer or deeply committed to the Microsoft enterprise hegemony, you're likely better off with a Mac. Recent studies have also demonstrated that Macs are more cost-effective in businesses, including one run by IBM, of all companies. The Macs' total cost of ownership over several years was significantly lower than the Windows PCs', and they required far less help from the technical support staff.

    I do have a 27" Apple (the one before Retina), and before my 27" I had a 21" that I gave to my son (he is in college now). I went with Apple after using Windows PCs all my life, and I cannot deny they are superb and very secure; because I am a photographer, I needed something like this. I also have a Windows PC that I use exclusively for gaming, but for making payments, email, and managing sensitive stuff, my Apple is my way to go. Even if I had the money to buy this new machine, I would not buy it. I can stay with my Apple for another 5-7 years, and by then that price will have come down.

    Price any comparable machine and you will find that the cost is about the same. The new HP Z workstations with a comparable (though lesser) configuration are about the same price.

    The Microsoft Surface Studio with only an i7 starts at $4,000. I don't think that a person buying a Lamborghini would say they are greedy for creating a high-performance machine. The only issue is that you are locked into the configuration. If you need a high-performance machine, this, or the future release of the Mac Pro, will be for you.


    Yes, I agree that it is absurd. For those of us who work 40 hours per week on a $30-35k salary, or perhaps $40k, it is simply unaffordable. However, I believe this machine has been developed for the very rich and for millionaires. For them the price will not be absurd, and even if they do not need it for business, they will buy it so they can continue to look different from the rest of us, the majority. Soon, in less than six months, we will be able to upgrade our Windows PCs to new technology that will match or supersede this Apple machine for much less money than this car-like price.

    'I don't think you can get that with a spare-parts company.' The one-year warranty? Consider what you get if you snap a computer together yourself.

    Intel gives a 3-year warranty on Xeon processors. Some RAM companies give lifetime warranties. A Vega 64 graphics card has a 2-year warranty. SSDs, 3 years.

    Motherboards, 3 years, depending on brand. Take all of these parts, stick them in an Apple case, and the warranties are all reduced to one year.

    That doesn't speak well for Apple's confidence in their product design.

    Rather than all this raw power, I would just like to see already-existing Apple technology like the Apple Pencil integrated into the pro Mac line, similar to the MS Surface Studio. 2D creators simply don't need all this grunt in the iMac Pro. The market has probably never been large enough, so Apple has always been happy to leave Wacom to meet these users' needs, but I think the Apple Pencil is actually much better than Wacom's pens. So even with Wacom's impressive new 4K Cintiqs coming next month, I'd probably still rather have an actual 5K Mac 'Surface Studio', simply because of the Pencil.

    I'm generally familiar with what an actual Radeon Pro Vega 64 looks like; it comes in its own liquid-cooled case, offers 4 HDMI ports, pulls almost 500 watts, and can easily be swapped for another in under 30 seconds. Obviously a real Radeon Vega isn't going to fit into this all-in-one computer, which happens to have zero ports for external monitors. Makes me wonder what sort of non-standard / proprietary / customized graphics this thing actually has. I also can't help but notice there is no audio port (other than a mini-headphone jack): no audio in, and no audio out.

    Aren't these machines supposed to be built for 'pro' use?

    The iMac Pro is positioned in the market as a workstation, not a desktop. If it were built as a traditional modular tower with expansion room, nobody would so much as blink. But it isn't.

    It's an appliance without a single user-serviceable part. Hence the raised eyebrows. As far as ECC RAM is concerned, I'm just going to throw this out here. And one more thing: if you look at Dell and HP workstations, you'll notice they come with ECC RAM.

    Boxx and Puget have it in their Xeon-equipped products. Maybe they and their customers know something you don't. :-)

    People pay even more for computers when computing power translates into revenue.

    But you are right, this is still just a desktop-class computer, primarily for photographers and video editors. Yet Apple chose to put in a server-class processor, ECC memory (why would you do that in this class of computer?), etc. I assume this is mostly in order to justify a high price, and thus margins. Due to its physical architecture, it will clearly not be able to handle continuous high computing loads; there is simply nowhere to put all that heat. People running deep learning, for instance, regularly put 2-3 graphics cards in a large box, with water cooling, large fans, etc.


    Lotzi, you make some interesting points for those of us who need to buy a computer in the near future but are not computer experts. Could you explain further? Why wouldn't someone working on video editing or churning through a thousand photos a day need error-correcting memory? Or is that not what it's for? And why is a server-class processor overkill?


    Again, speaking from total ignorance: reality TV editing is done by huge teams pushing data back and forth all day, and a typical film can have 100 or 200 TB of data. Or is that not what a server-class processor does?

    @MrBrightSide: For 3D rendering work it's best to have more than one GPU: the more cores, the faster the render completes. And if one GPU goes wrong, you just take it out and continue working until the replacement arrives. This Apple toy is limited to one GPU because they've crammed it into a tiny case.
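    A rough sense of why render boxes stack GPUs, as a sketch: offline rendering is close to embarrassingly parallel, so Amdahl's law with a small serial fraction s (scene loading, compositing; the 5% here is an assumed, illustrative value) predicts nearly linear speedup:

        # Amdahl's law: speedup from n GPUs when a fraction s of the job
        # stays serial. s = 0.05 is made up for illustration.
        def speedup(n: int, s: float = 0.05) -> float:
            return 1 / (s + (1 - s) / n)

        for n in (1, 2, 4):
            print(n, round(speedup(n), 2))  # 1.0, 1.9, 3.48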

    And sitting next to a very powerful CPU, that lone GPU and the CPU alike will undoubtedly throttle under heavy load to stop themselves going up in flames. There's a good reason why high-end workstation PCs and gaming rigs have lots of fans and water cooling: they need them.

    Many years ago, there were studies on how to sell audio gear. If a company wanted to sell lots of units at $1,000, the best course of action was to introduce a higher-end version of the same thing that cost, say, $1,800. The general public would ooh and aah over the highest-priced, highest-spec'd version, declare it overpriced, and then pick up the model one step down, with the more reasonable price point and the manufacturer's desired profit margin.

    I look at some of the highest-spec'd computers the same way: as a way to draw attention to the brand and its latest offerings. All the comments here indicate that the old strategies can still be effective. I'm sure Apple will sell some of the most tricked-out versions of these computers, but they are a mass-market electronics firm at this point. They will make the bulk of their profit from many sales of more mid-priced versions.

    I spent around $900 in 1999 (or was it 2000?) on building a Pentium III machine: 600MHz, 64MB SDRAM, 20GB HDD, 3dfx graphics and a CD-ROM drive! :) A year later: an additional 128MB of SDRAM and a CD-RW drive; 3dfx went out of business and I was able to get their last-gen 32MB card for just $70. Two years after that I built a new PC: a 2GHz Athlon, 512MB DDR400, 80GB SATA HDD, 128MB GeForce4 Ti. Night and day difference, and I paid less than $800 for it. I used it for 8 (eight!) years :) (made some cheap upgrades during that time, like a fully loaded 3x1GB of DDR400 and 2x80GB RAID 0 + 160GB) and then sold it for $150. Now I'm 'sitting' on an 'ancient' i7 workstation and it's like there's nothing new out there anymore.


    Is it me getting old, or did Silicon Valley run out of silicon? $15,000!? :D Man, Apple robbed you.
